Patent abstract:
A mobile terminal (100) according to the present disclosure may include a body; a wireless communication unit (110) configured to communicate with an external communication device; a touch screen (151) formed to display a home screen and detect a touch; and a controller (180) configured to control the touch screen (151) to display a control screen for controlling at least a portion of the external devices at a location where the body is located, in accordance with the external communication device connected via the wireless communication unit (110), when a predetermined type of touch is applied to the touch screen (151) while the home screen is displayed.

Publication number: FR3041447A1
Application number: FR1658345
Filing date: 2016-09-08
Publication date: 2017-03-24
Inventors: Seonhwi Cho; Donghoe Kim; Wanho Ju; Sungchae Na
Applicant: LG Electronics Inc.
Patent description:
MOBILE TERMINAL AND ASSOCIATED CONTROL METHOD

The present disclosure relates to a mobile terminal for performing communication with an external communication device, and an associated control method. Terminals can be divided into mobile/portable terminals and stationary terminals according to their mobility. Mobile terminals can further be classified into handheld terminals and vehicle-mounted terminals according to whether a user can directly carry them. As it becomes multifunctional, a mobile terminal can capture still images or moving images, play music or video files, play games, receive broadcasts and the like, so as to be implemented as an integrated multimedia player. Efforts are underway to support and improve the functionality of mobile terminals. Such efforts include software and hardware enhancements, as well as changes and improvements to the structural components. As an example of improvement in the software domain, technical development associated with applying the Internet of Things to a terminal has been carried out.

Embodiments will be described in detail with reference to the following drawings, in which like reference numerals refer to like elements, and in which: Fig. 1A is a block diagram for explaining a mobile terminal associated with the present disclosure, and Figs. 1B and 1C are conceptual views in which a mobile terminal associated with the present disclosure is seen from different directions; Fig. 2A is a flowchart for explaining a control method associated with a predetermined location according to the present disclosure, and Figs. 2B and 2C are conceptual views for explaining the control method of Fig. 2A; Figs. 3A and 3B are conceptual views for explaining a three-dimensional map image included in a control screen associated with the present disclosure; Figs. 4A-4I are conceptual views for explaining a control method for mutual conversion between various maps included in a control screen associated with the present disclosure; Figs. 5A and 5B are conceptual views for explaining a control method for displaying status information or notification information of an external device using an image object associated with the present disclosure, and Figs. 5C and 5D are conceptual views for explaining a method of controlling an external device using an image object; Figs. 6A-6E are conceptual views for explaining a method of controlling an external device based on user input entered through a window associated with the present disclosure; Figs. 7A and 7B are conceptual views for explaining a window for controlling a plurality of external devices associated with the present disclosure; Figs. 8A and 8B are conceptual views for explaining a control method associated with ending a control screen for a predetermined location associated with the present disclosure; Figs. 9A to 9E are conceptual views for explaining various embodiments of displaying information about an event occurring on an external device associated with the present disclosure; and Figs. 10A to 10D are conceptual views for explaining a method of displaying notification information of an external device when a notification mode according to an embodiment of the present disclosure is set to a sensitive mode.

A description will now be provided in detail according to the illustrative embodiments disclosed herein, with reference to the accompanying drawings.
For a brief description with reference to the drawings, the same or equivalent components are provided with the same reference numbers, and the description thereof will not be repeated. The suffixes "module" and "unit" used for constituent elements in the following description are merely intended for ease of description of this specification, and the suffixes themselves do not carry any special meaning or function. In describing the present disclosure, where a detailed explanation of a related known function or construction is considered to diverge unnecessarily from the gist of the present disclosure, such an explanation has been omitted but would be understood by those skilled in the art. The accompanying drawings are used to assist in an easy understanding of the technical idea of the present disclosure, and it is to be understood that the idea of the present disclosure is not limited by the accompanying drawings. The idea of the present disclosure should be construed to extend to any modifications, equivalents and substitutes beyond the accompanying drawings. The mobile terminals described herein may include cellular phones, smart phones, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like. However, it can be readily understood by those skilled in the art that the configuration according to the illustrative embodiments of this specification may also be applied to stationary terminals such as digital TVs, desktop computers and the like, except for cases applicable only to mobile terminals. With reference to Figs. 1A-1C, Fig. 1A is a block diagram for explaining a mobile terminal associated with the present disclosure, and Figs. 1B and 1C are conceptual views in which an example of the mobile terminal is seen from different directions. The mobile terminal 100 may include components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, an electric power supply unit 190 and the like. Fig. 1A illustrates the mobile terminal having various components, but it should be understood that implementing all of the illustrated components is not a requirement; more or fewer components may alternatively be implemented. In more detail, the wireless communication unit 110 among these components may typically include one or more modules that allow wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network within which another mobile terminal 100 (or an external server) is located. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-distance communication module 114, a location information module 115 and the like. The input unit 120 may include an image capture apparatus 121 for inputting an image signal, a microphone 122 or an audio input module for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button (or a mechanical key), etc.) to allow a user to enter information.
Audio data or image data collected by the input unit 120 may be analyzed and processed by a user's control command. The detection unit 140 may include at least one sensor that detects at least one of internal information of the mobile terminal, a surrounding environment of the mobile terminal, and user information. For example, the detection unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyro sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a digital scanning sensor, an ultrasonic sensor, an optical sensor (for example, refer to the image capture apparatus 121), a microphone (refer to reference number 122), a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (for example, an electronic nose, a health sensor, a biometric sensor, etc.). The mobile terminal described herein may also use information obtained by combining information detected by at least two of these sensors. The output unit 150 may be configured to output an audio signal, a video signal, or a tactile signal, and may include a display unit 151, an audio output module 152, a haptic module 153, an optical output module 154, and the like. The display unit 151 may have an inter-layer structure or an integrated structure with a touch sensor so as to implement a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as serve as the user input unit 123 that provides an input interface between the mobile terminal 100 and the user. The interface unit 160 may interface with various types of external devices connected to the mobile terminal 100. The interface unit 160, for example, may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. The mobile terminal 100 may perform an appropriate control associated with a connected external device in response to the external device being connected to the interface unit 160. The memory 170 may store a plurality of application programs (or applications) executed in the mobile terminal 100, data for operations of the mobile terminal, instruction words, and the like. At least some of these application programs may be downloaded from an external server via wireless communication. Others may be installed in the mobile terminal 100 at the time of shipment for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, etc.). The application programs may be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) of the mobile terminal 100. The controller 180 may typically control the general operation of the mobile terminal 100 in addition to operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like that are input or output by the aforementioned components, or by activating application programs stored in the memory 170.
The controller 180 can control at least a portion of the components illustrated in Fig. 1A in order to drive the application programs stored in the memory 170. In addition, the controller 180 can operate at least two of the components included in the mobile terminal 100 in combination in order to drive the application programs. The electric power supply unit 190 may receive external electric power or internal electric power and supply the appropriate electric power required to operate the respective elements and components included in the mobile terminal 100, under the control of the controller 180. The electric power supply unit 190 may include a battery, and the battery may be a built-in battery or a replaceable battery. At least a portion of these elements and components may be combined to implement the operation and control of a mobile terminal, or a control method of a mobile terminal, according to the various illustrative embodiments described herein. Also, the operation and the control, or the control method, of the mobile terminal may be implemented in the mobile terminal by activating at least one application program stored in the memory 170. Hereinafter, each aforementioned component will be described in more detail with reference to Fig. 1A, before explaining the various illustrative embodiments implemented by the mobile terminal 100 having this configuration. First, considering the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 can receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. At least two broadcast receiving modules 111 may be provided in the mobile terminal 100 to simultaneously receive at least two broadcast channels or to switch between broadcast channels. The mobile communication module 112 may transmit/receive wireless signals to/from at least one network entity, for example, a base station, an external mobile terminal, a server, and the like, on a mobile communication network, which is constructed according to technical standards or transmission methods for mobile communications (e.g., Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), Long Term Evolution (LTE), etc.). Here, the wireless signals may include an audio call signal, a video (telephony) call signal, or data in various formats according to the transmission/reception of text/multimedia messages. The wireless Internet module 113 denotes a module for wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit/receive wireless signals over communication networks using wireless Internet technologies. Examples of such wireless Internet access may include wireless LAN (WLAN), Wi-Fi Direct, DLNA, WiBro, WiMAX, HSDPA, LTE, and the like. The wireless Internet module 113 may transmit/receive data according to at least one wireless Internet technology, within a range including even Internet technologies not mentioned above.
Given that wireless Internet access according to WiBro, HSDPA, GSM, CDMA, WCDMA, LTE and the like is executed via a mobile communication network, the wireless Internet module 113, which performs the wireless Internet access via the mobile communication network, may be understood as a type of the mobile communication module 112. The short-range communication module 114 denotes a module for short-distance communications. Suitable technologies for implementing short-range communications may include BLUETOOTH™, Radio Frequency Identification (RFID), IrDA, Ultra-Wide Band (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, and the like. The short-range communication module 114 may support wireless communications between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless personal area networks. Here, the other mobile terminal 100 may be a wearable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is capable of exchanging data with (or cooperating with) the mobile terminal 100. The short-distance communication module 114 may detect (recognize) a wearable device capable of communicating with the mobile terminal, near the mobile terminal 100. In addition, when the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present disclosure, the controller 180 may transmit at least a portion of the data processed in the mobile terminal 100 to the wearable device via the short-distance communication module 114. Thus, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received at the mobile terminal 100, the user can answer the call using the wearable device. Also, when a message is received at the mobile terminal 100, the user can check the received message using the wearable device. The location information module 115 denotes a module for detecting or calculating a position of the mobile terminal. Examples of the location information module 115 may include a GPS module or a Wi-Fi module. For example, when the mobile terminal uses the GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal may be acquired based on information related to a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module. If necessary, the location information module 115 may perform any function of another module of the wireless communication unit 110 to obtain data about the location of the mobile terminal, in a substitutional or additional manner. The location information module 115 may be a module used to acquire the position (or current position) of the mobile terminal, and is not necessarily limited to a module that directly calculates or acquires the position of the mobile terminal. Hereinafter, the input unit 120 will be described in more detail. The input unit 120 may be configured to provide an audio or video signal (or information) input, or information entered by a user, to the mobile terminal.
For the input of image information, the mobile terminal 100 may include one or a plurality of image capture apparatuses 121. The image capture apparatus 121 may process image frames of still or moving images obtained by image sensors in a video call mode or a capture mode. The processed image frames may be displayed on the display unit 151. The plurality of image capture apparatuses 121 disposed in the mobile terminal 100 may be arranged in a matrix configuration. By using image capture apparatuses 121 in such a matrix configuration, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. Also, the plurality of image capture apparatuses 121 may be arranged in a stereoscopic structure to acquire a left image and a right image for implementing a stereoscopic image. The microphone 122 may process an external audio signal into electrical audio data. The processed audio data may be used in various ways according to a function being performed in the mobile terminal 100 (or an application program being executed). The microphone 122 may include various noise removal algorithms to remove noise generated while receiving the external audio signal. The user input unit 123 may receive information entered by a user. When information is input through the user input unit 123, the controller 180 may control an operation of the mobile terminal 100 to correspond to the entered information. The user input unit 123 may include a mechanical input element (or a mechanical key, for example, a button located on a front/rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch input means. As an example, the touch input means may be a virtual key, a soft key or a visual key, which is displayed on a touch screen through software processing, or a touch key arranged on a portion other than the touch screen. The virtual key or the visual key may be displayed on the touch screen in various forms, for example, graphic, text, icon, video, or a combination thereof. The detection unit 140 may detect at least one of internal information of the mobile terminal, surrounding environment information of the mobile terminal, and user information, and generate a detection signal corresponding thereto. The controller 180 may control an operation of the mobile terminal 100 or execute data processing, a function or an operation associated with an application program installed in the mobile terminal, based on the detection signal. Hereinafter, a description will be given in more detail of representative sensors among the various sensors that may be included in the detection unit 140. First, the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a surface to be detected, or an object disposed near a surface to be detected, by using an electromagnetic field or infrared rays without mechanical contact. The proximity sensor 141 may be arranged in an interior region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141 may have a longer lifespan and greater utility than a contact sensor.
The proximity sensor 141, for example, may include a transmission-type photoelectric sensor, a direct reflection-type photoelectric sensor, a mirror reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and so on. When the touch screen is implemented as a capacitive type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen through changes in an electromagnetic field, which is responsive to the approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor. Hereinafter, for a brief explanation, a state in which a pointer is positioned near the touch screen without contacting it will be called a "touch in proximity", while a state in which the pointer substantially contacts the touch screen will be called a "touch contact". The position corresponding to a touch in proximity of the pointer on the touch screen is a position where the pointer faces the touch screen perpendicularly upon such a touch in proximity. The proximity sensor 141 can detect a touch in proximity, and touch-in-proximity profiles (e.g., distance, direction, speed, time, position, moving state, etc.). The controller 180 can process data (or information) corresponding to the touches in proximity and the touch-in-proximity profiles detected by the proximity sensor 141, and output visual information corresponding to the processed data on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data (or information) depending on whether a touch at the same point on the touch screen is a touch in proximity or a touch contact. A touch sensor may detect a touch (or touch input) applied to the touch screen (or display unit 151) using at least one of a variety of touch methods, such as a resistive type, a capacitive type, an infrared type, a magnetic type, and the like. As an example, the touch sensor may be configured to convert changes of a pressure applied to a specific portion of the display unit 151, or of a capacitance occurring at a specific portion of the display unit 151, into electrical input signals. Also, the touch sensor may be configured to detect not only a touched position and a touched area, but also touch pressure. Here, a touch object is an object that applies a touch input to the touch sensor; examples include a finger, a touch pen, a stylus, a pointer and the like. When touch inputs are detected by the touch sensors, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals and then transmit corresponding data to the controller 180. Therefore, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, or the controller 180 itself. The controller 180 may perform a different control, or the same control, depending on the type of object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform a different control or the same control according to the object providing a touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently being executed; a minimal sketch of the proximity/contact distinction follows below.
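To make the "touch in proximity" versus "touch contact" distinction above concrete, here is a minimal, non-authoritative Kotlin sketch; the event types, the hover-distance parameter, and the dispatch logic are illustrative assumptions, not names or values taken from the disclosure.

```kotlin
// Hypothetical sketch: classifying a pointer reading as a touch in proximity
// (hover) or a touch contact, as defined above. Units and names are assumptions.
sealed class PointerState {
    data class TouchInProximity(val distanceMm: Float) : PointerState() // near, no contact
    object TouchContact : PointerState()                                // on the screen
}

fun classifyPointer(contactDetected: Boolean, hoverDistanceMm: Float): PointerState =
    if (contactDetected) PointerState.TouchContact
    else PointerState.TouchInProximity(hoverDistanceMm)
```

A controller in the sense of the disclosure could then branch on this state to perform different operations for a touch in proximity versus a touch contact at the same point.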
Meanwhile, the touch sensor and the proximity sensor may be operated individually or in combination to detect various types of touch, such as a short touch (or tap), a long touch, a multi-touch, a touch-and-slide (drag), a quick touch (flick), a pinch-in touch, a pinch-out touch, a sliding touch (swipe), a floating touch (hover), and the like. An ultrasonic sensor may be configured to recognize position information of a detection object using ultrasonic waves. The controller 180 may calculate the position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for light to arrive at the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. The position of the wave generation source can be calculated using this fact: in more detail, the position can be calculated from the time difference between the arrival of the ultrasonic wave and the arrival of the light, with the light serving as a reference signal. The image capture apparatus 121, described as a component of the input unit 120, may be a type of camera sensor. The camera sensor may include at least one of a photo sensor and a laser sensor. The image capture apparatus 121 and the laser sensor may be combined to detect a touch of the detection object with respect to a 3D stereoscopic image. The photo sensor may be laminated on the display device, and may be configured to scan a movement of the detection object near the touch screen. In more detail, the photo sensor may include photodiodes and transistors in rows and columns, to scan content placed on the photo sensor using an electrical signal that changes according to the amount of light applied. Namely, the photo sensor can calculate the coordinates of the detection object according to variations of light, and thereby obtain position information of the detection object.
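As a concrete illustration of the time-difference calculation just described, the following hedged Kotlin sketch treats the light arrival as a near-instant reference and converts the ultrasonic time of flight into a distance; the function name, timestamp units, and speed-of-sound constant are assumptions for illustration, not values from the disclosure.

```kotlin
// Hypothetical sketch of the time-of-flight estimate described above. Light
// arrives almost instantly, so the ultrasonic travel time is approximately
// the interval between the light arrival and the ultrasonic arrival.
const val SPEED_OF_SOUND_M_PER_S = 343.0 // in air at roughly 20 °C (assumption)

fun distanceToWaveSourceMeters(lightArrivalNs: Long, ultrasoundArrivalNs: Long): Double {
    val travelTimeSeconds = (ultrasoundArrivalNs - lightArrivalNs) / 1e9
    return SPEED_OF_SOUND_M_PER_S * travelTimeSeconds
}
```

Combining several such distances from a plurality of ultrasonic sensors would then allow the position of the wave generation source to be estimated, consistent with the plural-sensor arrangement described above.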
The display unit 151 may output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to such execution screen information. The display unit 151 may also be implemented as a stereoscopic display unit for displaying stereoscopic images. The stereoscopic display unit 152 may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme), or the like. The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170, in an incoming call receiving mode, a calling mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 may also provide audible output signals related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal, and may include a receiver, a speaker, a buzzer, or the like. A haptic module 153 may generate various tactile effects that the user can feel. A typical example of a tactile effect generated by the haptic module 153 is vibration. The intensity, profile and the like of the vibration generated by the haptic module 153 may be controlled by user selection or by a controller setting. For example, the haptic module 153 may output different vibrations in a combined or sequential manner. In addition to vibration, the haptic module 153 may generate various other tactile effects, including an effect by stimulation such as a pin arrangement moving vertically with respect to the contacted skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch on the skin, the contact of an electrode, an electrostatic force, etc., and an effect of reproducing the sensation of cold and warmth using an element that can absorb or generate heat, and the like. The haptic module 153 may be implemented to allow the user to feel a tactile effect through a muscle sensation, for example via the fingers or the arm of the user, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate the generation of an event using light from a light source. Examples of events generated in the mobile terminal 100 may include message reception, incoming call reception, a missed call, an alarm, a schedule notification, e-mail reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented such that the mobile terminal emits monochromatic light or light with a plurality of colors. The output signal may be terminated when the mobile terminal detects that the user has checked the event. The interface unit 160 may serve as an interface for every external device connected to the mobile terminal 100. For example, the interface unit 160 may receive data transmitted from an external device, receive electric power to be transferred to each element within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to an external device. For example, the interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100, and may include a user identification module (UIM), a subscriber identification module (SIM), a universal subscriber identification module (USIM), and the like. In addition, the device having the identification module (hereinafter called an "identification device") may take the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected to an external docking station, the interface unit 160 may serve as a passage to allow electric power from the docking station to be supplied to the mobile terminal 100 therethrough, or as a passage to allow various command signals entered by the user on the docking station to be transferred to the mobile terminal therethrough.
Various command signals or the electric power input from the docking station may serve as signals for recognizing that the mobile terminal is properly mounted on the docking station. The memory 170 may store programs for operations of the controller 180 and temporarily store input/output data (e.g., a phonebook, messages, still images, videos, etc.). The memory 170 may store data related to the various vibration and audio profiles that are output in response to touch inputs on the touch screen. The memory 170 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card micro type, a card-type memory (for example, SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. Also, the mobile terminal 100 may operate in relation to a web storage that performs the storage function of the memory 170 over the Internet. As mentioned above, the controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state to prevent a user from entering a control command with respect to applications when a state of the mobile terminal satisfies a preset condition. The controller 180 may also perform control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwritten input or a drawing input made on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of these components in order to implement the various illustrative embodiments described herein on the mobile terminal 100. The electric power supply unit 190 may receive external electric power or internal electric power and supply the appropriate electric power required to operate the respective elements and components included in the mobile terminal 100, under the control of the controller 180. The electric power supply unit 190 may include a battery, which may be a built-in battery that is rechargeable, or may be detachably coupled to the terminal body for charging. The electric power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160, to which an external (re)charger for supplying electric power to recharge the battery is electrically connected. As another example, the electric power supply unit 190 may be configured to recharge the battery wirelessly, without using the connection port. Here, the electric power supply unit 190 can receive electric power transferred from an external wireless electric power transmitter, using at least one of an inductive coupling method based on magnetic induction, or a magnetic resonance coupling method based on electromagnetic resonance. The various embodiments described herein may be implemented in a computer-readable medium or similar medium using, for example, software, hardware, or any combination thereof. Referring to Figs. 1B and 1C, the mobile terminal 100 described herein is provided with a bar-type terminal body.
However, the present disclosure is not limited to this, and may also be applicable to various structures such as a watch type, a bar type, an eyeglass type, or a folder type, a flip type, a slide type, a swing type, a pivot type, or the like, in which two or more bodies are associated with each other in a relatively movable manner. Here, the terminal body may be understood as a conception designating the mobile terminal 100 as at least one assembly. The mobile terminal 100 may include a housing (case, casing, cover, etc.) forming the appearance of the terminal. In this embodiment, the housing may be divided into a front housing 101 and a rear housing 102. Various electronic components may be incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be disposed between the front housing 101 and the rear housing 102. A display unit 151 may be disposed on a front surface of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 so as to form the front surface of the terminal body together with the front housing 101. In some cases, electronic components may also be mounted on the rear housing 102. Examples of such electronic components mounted on the rear housing 102 may include a detachable battery, an identification module, a memory card, and the like. Here, a rear cover 103 for covering the mounted electronic components may be detachably coupled to the rear housing 102. Thus, when the rear cover 103 is separated from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 may be partially exposed. In some cases, upon the coupling, the rear housing 102 may also be completely covered by the rear cover 103. The rear cover 103 may include an opening for externally exposing an image capture apparatus 121b or an audio output module 152b. The housings 101, 102, 103 may be formed by injection-molding a synthetic resin, or may be formed of a metal, for example, stainless steel (STS), titanium (Ti), or the like. Unlike the example in which a plurality of housings form an interior space for accommodating such various components, the mobile terminal 100 may be configured such that one housing forms the interior space. In this example, a mobile terminal 100 having a uni-body, formed such that a synthetic resin or metal extends from a side surface to a rear surface, may also be implemented. The mobile terminal 100 may include a waterproofing unit (not shown) to prevent water from entering the terminal body. For example, the waterproofing unit may include a waterproofing member located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, to seal the interior space when these housings are coupled. The mobile terminal 100 may include a display unit 151, first and second audio output modules 152a and 152b, a proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second image capture apparatuses 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160 and the like.
Hereinafter, a description will be given of an illustrative mobile terminal 100 in which the display unit 151, the first audio output module 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first image capture apparatus 121a and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122 and the interface unit 160 are disposed on a side surface of the terminal body, and the second audio output module 152b and the second image capture apparatus 121b are disposed on a rear surface of the terminal body, with reference to Fig. 1C. However, these components are not limited to this arrangement; they may be excluded or arranged on another surface if necessary. For example, the first manipulation unit 123a may not be disposed on the front surface of the terminal body, and the second audio output module 152b may be disposed on a side surface rather than the rear surface of the terminal body. The display unit 151 may output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to such execution screen information. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and an electronic ink display. The display unit 151 may be implemented as two or more in number according to the configured aspect of the mobile terminal 100. For instance, a plurality of display units 151 may be arranged on one surface so as to be spaced apart from or integrated with each other, or may be arranged on different surfaces. The display unit 151 may include a touch sensor that detects a touch on the display unit so as to receive a control command in a touch manner. When a touch is input to the display unit 151, the touch sensor may be configured to detect this touch, and the controller 180 may generate a control command corresponding to the touch. Content input in a touch manner may be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured as a film having a touch pattern, disposed between the window 151a and a display (not shown) on a rear surface of the window 151a, or as a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display: for example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may thus form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see Fig. 1A), and may therefore replace at least some of the functions of the first manipulation unit 123a. The first audio output module 152a may be implemented as a receiver for transferring voice sounds to the user's ear, or as a loudspeaker for outputting various alarm sounds or multimedia reproduction sounds. The window 151a of the display unit 151 may include a sound hole for emitting sounds generated by the first audio output module 152a. However, the present disclosure is not limited to this.
The mobile terminal may also be configured such that the sounds are released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole independently formed for outputting audio sounds may not be seen, or may otherwise be hidden in appearance, thereby further simplifying the appearance of the mobile terminal 100. The optical output module 154 may output light to indicate the generation of an event. Examples of events generated in the mobile terminal 100 may include message reception, incoming call reception, a missed call, an alarm, a schedule notification, e-mail reception, information reception through an application, and the like. When it is detected that the user has checked the event, the controller may control the optical output module 154 to stop the output of light. The first image capture apparatus 121a may process image frames such as still or moving images obtained by the image sensor in a video call mode or a capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to input a command for controlling the operation of the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation with a tactile sensation, such as touching, pushing, scrolling or the like. The drawings illustrate the first manipulation unit 123a as a touch key, but the present disclosure is not necessarily limited to this. For example, the first manipulation unit 123a may be configured as a mechanical key, or as a combination of a touch key and a push key. The content received by the first and second manipulation units 123a and 123b may be set in various ways. For example, the first manipulation unit 123a may be used by the user to input a command such as menu, start, cancel, search, or the like, and the second manipulation unit 123b may be used by the user to input a command such as controlling a volume level output from the first or second audio output module 152a or 152b, or switching into a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be disposed on the rear surface of the terminal body. The rear input unit may be manipulated by a user to input a command for controlling an operation of the mobile terminal 100, and the input content may be set in various ways. For example, the rear input unit may be used by the user to input a command such as power on/off, start, end, scroll or the like, controlling a volume level output from the first or second audio output module 152a or 152b, switching into a touch recognition mode of the display unit 151, or the like. The rear input unit may be implemented in a form allowing a touch input, a push input, or a combination thereof. The rear input unit may be arranged to overlap the display unit 151 of the front surface in a thickness direction of the terminal body. For example, the rear input unit may be disposed on an upper end portion of the rear surface of the terminal body such that a user can easily manipulate it with a forefinger when gripping the terminal body with one hand.
However, the present disclosure is not limited to this, and the position of the rear input unit may be changed. When the rear input unit is disposed on the rear surface of the terminal body, a new user interface can be implemented using the rear input unit. Also, the aforementioned touch screen or rear input unit may substitute for at least a portion of the functions of the first manipulation unit 123a located on the front surface of the terminal body. Therefore, when the first manipulation unit 123a is not disposed on the front surface of the terminal body, the display unit 151 may be implemented with a larger screen. The mobile terminal 100 may include a digital scanning sensor that scans a fingerprint of the user. The controller may use fingerprint information detected by the digital scanning sensor as a means of authentication. The digital scanning sensor may be installed in the display unit 151 or in the user input unit 123. The microphone 122 may be formed to receive the user's voice, other sounds, and the like. The microphone 122 may be provided at a plurality of locations, and configured to receive stereo sounds. The interface unit 160 may serve as a path allowing the mobile terminal 100 to exchange data with external devices. For example, the interface unit 160 may be at least one of a connection terminal for connecting to another device (e.g., a headset, an external speaker, or the like), a port for short-range communication (e.g., an IrDA port, a Bluetooth port, a wireless LAN port, and the like), or an electric power supply terminal for supplying electric power to the mobile terminal 100. The interface unit 160 may be implemented as a socket for accommodating an external card, such as a subscriber identification module (SIM), a user identification module (UIM), or a memory card for storing information. The second image capture apparatus 121b may further be mounted on the rear surface of the terminal body, with an image capturing direction substantially opposite to the direction of the first image capture apparatus 121a. The second image capture apparatus 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such an apparatus may be referred to as an "array image capture apparatus". When the second image capture apparatus 121b is implemented as an array image capture apparatus, images can be captured in various ways using the plurality of lenses, and images of better quality can be obtained. A flash 124 may be disposed adjacent to the second image capture apparatus 121b. When an image of a subject is captured with the image capture apparatus 121b, the flash 124 may illuminate the subject. The second audio output module 152b may further be disposed on the terminal body. The second audio output module 152b may implement stereophonic sound functions together with the first audio output module 152a (refer to Fig. 1A), and may also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be disposed on the terminal body. The antenna may be installed in the terminal body or formed on the housing. For example, an antenna forming a portion of the broadcast receiving module 111 (see Fig. 1A) may be retractable into the terminal body.
Alternatively, an antenna may be formed as a film attached to an inner surface of the rear cover 103, or a housing including a conductive material may serve as an antenna. An electric power supply unit 190 for supplying electric power to the mobile terminal 100 may be disposed on the terminal body. The electric power supply unit 190 may include a battery 191 mounted in the terminal body or detachably coupled to the outside of the terminal body. The battery 191 may receive electric power via a power source cable connected to the interface unit 160. Also, the battery 191 may be (re)chargeable wirelessly using a wireless charger. Wireless charging may be implemented by magnetic induction or by electromagnetic resonance. The drawing illustrates that the rear cover 103 is coupled to the rear housing 102 to cover the battery 191, in order to prevent separation of the battery 191 and to protect the battery 191 from external impact or foreign materials. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear housing 102. An accessory for protecting the appearance, or for assisting or extending the functions of the mobile terminal 100, may also be provided on the mobile terminal 100. One example of such an accessory is a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100. The cover or pouch may cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of an accessory is a touch pen for assisting or extending touch input on a touch screen. Meanwhile, a mobile terminal according to the present disclosure provides a function of performing a control associated with a location, depending on where a body of the mobile terminal (hereinafter called the "body") is located. For this purpose, an external communication device may be pre-installed at a predetermined location. The external communication device is a device capable of communicating with the body; in other words, the external communication device can communicate with the wireless communication unit 110 of the body. Specifically, the external communication device may be implemented with one or more beacons. A beacon, as a Bluetooth 4.0 wireless short-distance communication device, can communicate with devices within a maximum distance of about 70 m. For example, when a beacon is installed at the predetermined location, whether or not the body is located at the predetermined location can be determined from the signal strength of the beacon received at the wireless communication unit 110. Specifically, if the size (or radius) of a predetermined location is about A (m), and the intensity received from the beacon is B (dB) when the body is located about A (m) from the beacon, then the controller 180 can determine that the body is located within the predetermined location when the intensity of the beacon signal received by the body is greater than B (dB). According to the present disclosure, the external communication device may include a beacon, a Wi-Fi module, or the like. In the latter case, the mobile terminal can acquire its location based on information from a wireless access point (AP) transmitting or receiving a wireless signal to or from the Wi-Fi module. According to the present disclosure, the predetermined location may be preset by a user.
In other words, the external communication device may be installed in advance by the user at the predetermined location. In addition, there may be a plurality of predetermined locations; in this case, the predetermined places can be called a first place, a second place, a third place, ..., an nth place. For example, where the predetermined locations are a user's home, office, and car, the locations may be sequentially called a first place, a second place, and a third place. A mobile terminal according to the present disclosure may perform a control associated with the place at which the body is located. In other words, the mobile terminal can perform a control associated with a first place when the body is located in the first place, and perform a control associated with a second place when the body is located in the second place. Moreover, each place can be divided into a plurality of regions. For example, the first place can be divided into a region 1-1, a region 1-2, a region 1-3, ..., a (1-n)th region. Each region may correspond to bedroom 1, bedroom 2, the living room, ..., the bathroom, and the like. In this case, a mobile terminal according to the present disclosure may perform a control associated with the region, among the plurality of regions, at which the body is located. This will be described in more detail below with reference to FIG. In addition, hereinafter, in the description of the present disclosure with reference to the accompanying drawings, when at least two images are illustrated in a 2-by-2 arrangement in one drawing (see, for example, Fig. 2B), the image in the upper left corner, the image in the upper right corner, the image in the lower right corner, and the image in the lower left corner will be called a first drawing, a second drawing, a third drawing, and a fourth drawing, respectively. In addition, when at least two images are shown in a column from top to bottom in one drawing, the images will be sequentially called a first drawing, a second drawing, and so on, starting from the uppermost image. Similarly, when at least two images are illustrated in a row from left to right in one drawing, the images will be sequentially called a first drawing, a second drawing, and so on, starting from the leftmost image. Meanwhile, when a request from the user to perform a control associated with the predetermined location is received, a mobile terminal according to the present disclosure may display a control screen for controlling the location. This will be described below in detail with reference to the drawings. Fig. 2A is a flowchart for explaining a control method associated with a predetermined location according to the present disclosure, and Figs. 2B and 2C are conceptual views for explaining the control method of Fig. 2A. Referring to Fig. 2A, the controller 180 detects that the body is communicating with an external communication device installed at a predetermined location (S210). Specifically, when the body is located at a predetermined location, the wireless communication unit 110 of the body communicates with the external communication device installed at that location. Here, the external communication device may include at least one of one or more beacons and one or more Wi-Fi modules, as described above. When the wireless communication unit 110 is connected wirelessly to the external communication device, the controller 180 can determine that the body is located at the predetermined location; a minimal sketch of this presence check follows below.
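As a minimal, non-authoritative sketch of this presence check, the following Kotlin fragment combines the connection criterion above with the signal-strength criterion described next; the names and values (PresenceDetector, BeaconSignal, the -70 dB default) are hypothetical illustrations, not taken from the disclosure.

```kotlin
// Hypothetical sketch of the presence determination of step S210.
// BeaconSignal and the threshold default are illustrative assumptions.
data class BeaconSignal(val placeId: String, val rssiDb: Int)

class PresenceDetector(
    // B (dB): the intensity received when the body is about A (m) from the beacon.
    private val presenceThresholdDb: Int = -70
) {
    /** The body is treated as being within the predetermined place when it is
     *  wirelessly connected to the external communication device, or when the
     *  received beacon intensity exceeds the calibrated threshold B (dB). */
    fun isBodyAtPlace(connected: Boolean, signal: BeaconSignal?): Boolean {
        if (connected) return true
        return signal != null && signal.rssiDb > presenceThresholdDb
    }
}
```

In practice the threshold B would be calibrated per place, since it depends on the radius A and on the beacon's transmit power.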
Alternatively, when the signal of the external communication device received by the wireless communication unit 110 is greater than a predetermined intensity, the controller 180 can determine that the body is located at the predetermined location. When it is determined that the body is located at a predetermined location, the controller 180 may control the display unit 151 to display notification information indicating to the user that a control over the location is permitted. The notification information can be displayed in various ways, using at least one of auditory, tactile and visual modes; the output mode of the notification information can be preset by the user. As an example of displaying the notification information in a visual mode, with reference to the first drawing of Fig. 2B, the notification information can be displayed as a notification icon 210. Specifically, when the controller 180 determines that the body is located at a predetermined location in a state where a predetermined screen is displayed on the touch screen 151, the notification icon may be displayed on one side of the display bar 220. It is illustrated in the drawing that the predetermined screen is a home screen 230, but the predetermined screen may also be at least one of a lock screen and an execution screen of a user-preset application, in addition to the home screen. Although not shown in the drawing, as another example of displaying the notification information in a visual mode, the notification information can be displayed as a notification window. Specifically, when the controller 180 determines that the body is located at a predetermined location in a state where a predetermined screen is displayed on the touch screen 151, the notification window may be displayed at an arbitrary location on the touch screen 151. Further, when the controller 180 determines that the body is located at a predetermined location while the touch screen 151 is in an inactive state, the touch screen 151 may be activated to display the notification icon 210 or the notification window, as described above, with a predetermined screen being displayed. Through the display of the notification icon 210 or the notification window, the user may recognize that control over the predetermined location is currently permitted. The controller 180 may then receive a request from the user to perform a control associated with the predetermined location (S220). Subsequently, the controller 180 controls the touch screen 151 to display a control screen 250 for performing a control associated with the predetermined location, according to the received user request (S230); a hedged end-to-end sketch of this flow follows below.
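Tying the steps together, here is a rough, non-authoritative Kotlin sketch of the S210-S230 sequence, reusing the hypothetical PresenceDetector above; the ControlScreenFlow name and its methods are illustrative assumptions, not names from the disclosure.

```kotlin
// Hypothetical end-to-end sketch of steps S210-S230 of Fig. 2A, reusing the
// PresenceDetector and BeaconSignal sketches above. Names are assumptions.
class ControlScreenFlow(private val detector: PresenceDetector) {
    var notificationShown = false   // notification icon 210 / notification window
        private set
    var controlScreenShown = false  // control screen 250
        private set

    // S210: the body communicates with the external communication device.
    fun onBeaconUpdate(connected: Boolean, signal: BeaconSignal?) {
        notificationShown = detector.isBodyAtPlace(connected, signal)
        if (!notificationShown) controlScreenShown = false
    }

    // S220/S230: a predetermined type of touch requests the control screen,
    // which is displayed only while the body is at the predetermined place.
    fun onPredeterminedTouch() {
        if (notificationShown) controlScreenShown = true
    }
}
```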
That is, when the predetermined type of touch 241 is applied to the touch screen 151 in a state where a predetermined screen 230 is displayed thereon, the control screen 250 for performing a control associated with the predetermined location is displayed. In other words, a screen conversion from the predetermined screen 230 to the control screen 250 is performed on the touch screen 151. Here, it is illustrated in the first drawing of Figure 2B that the predetermined screen is a home screen, but the present disclosure may not necessarily be limited to this. In other words, the predetermined screen may be any one of a home screen, a lock screen and a screen for executing a predetermined application. Further, FIG. 2B illustrates the predetermined type of touch 241 applied to the predetermined screen in a state where the foregoing notification information (notification information indicating that control over a predetermined location is activated) 210 is displayed thereon, but the present disclosure may not necessarily be limited to this. In other words, if the body and the external communication device are connected to each other wirelessly, the control screen may be displayed according to the predetermined type of touch 241 applied thereto, regardless of whether or not the notification icon 210 is displayed.

On the other hand, the predetermined type of touch related to the display function of the control screen can be performed in various modes. Referring to Figs. 2B and 2C, the predetermined type of touch may be any of a pinch-out touch 241, a multi-touch 242, and the like. On the other hand, the present disclosure may not necessarily be limited to the preceding touch modes. For example, the predetermined type of touch can include various types of touch. For example, the various types of touch can include a short touch (or tap), a long touch, a double touch, a multi-touch, a touch-and-drag, a flick, a pinch-in touch, a pinch-out touch, a swipe, a hover touch, and the like. Hereinafter, the various types of touch will be described in more detail.

A short touch (or tap) may be a touch in which a touch object (e.g., a finger, a stylus, etc.) comes into contact with the touch screen 151 (or a touch is applied) and is then released within a predetermined period. For example, the short touch (or tap) may be a touch in which a touch object is in contact with the touch screen for a short time, like a single mouse click. A long touch may be a touch in which a touch object comes into contact with the touch screen 151 and is then held for more than a predetermined period. For example, the long touch may be a touch in which a touch is applied to the touch screen 151 by a touch object and then held for more than a predetermined period. More specifically, the long touch can be a touch in which the touch is held at one position on the touch screen for a predetermined period and then released. In addition, the long touch can be understood as a touch corresponding to a touch-and-hold operation in which the contact state of a touch object is maintained on the touch screen 151 for more than a predetermined period. A double touch can be a touch in which the short touch is applied consecutively to the touch screen 151 at least twice within a predetermined period. The predetermined periods described for the short touch, the long touch and the double touch can be determined by user setting.
A multi-touch may be a touch applied to at least two touch positions on the touch screen 151 at substantially the same time. Moreover, the multi-touch can be a touch applied to a predetermined number of two or more touch points on the touch screen for more than a predetermined period. A touch-and-drag may be a touch in which a contact started from a first position of the touch screen 151 is moved consecutively along the touch screen in one direction and then released at a second position different from the first position. Specifically, the touch-and-drag may be a touch applied to one position of the touch screen 151 by a touch object, extended consecutively along the touch screen 151 while being maintained, and then released at a position different from said one position. In addition, the touch-and-drag may denote a touch in which a touch is applied to one position of the touch screen 151 and then extended consecutively from that touch. A flick may be a touch in which the touch-and-drag is applied within a predetermined period. Specifically, the flick may be a touch in which the touch object applying the touch-and-drag is released from the touch screen 151 within a predetermined period. In other words, the flick can be understood as a touch-and-drag applied at a speed above a predetermined speed. A swipe can be a touch-and-drag applied in a straight line.

A pinch-in touch may be a touch in which at least one of a first touch and a second touch applied to two different positions (two separate positions) on the touch screen 151 is extended in a direction toward the other. For example, the pinch-in touch may be a touch implemented by an operation of reducing the distance between two fingers in a state where the fingers are in contact with two separate positions, respectively, on the touch screen 151. A pinch-out touch may be a touch in which at least one of a first touch and a second touch applied to two different positions (two separate positions) on the touch screen 151 is extended in a direction away from the other. For example, the pinch-out touch may be a touch implemented by an operation of increasing (extending) the distance between two fingers in a state where the fingers are in contact with two separate positions, respectively, on the touch screen 151. A hover touch may be a touch corresponding to an operation of a touch object in a space away from the touch screen 151 while the touch object is not in contact with the touch screen 151, and may be, for example, a proximity touch as shown in Figure 1A. For example, the hover touch may be a touch corresponding to an operation in which the touch object is held at a position separated from the touch screen 151 for more than a predetermined period. According to the present disclosure, it will be described, as an example, that the predetermined type of touch is a touch-and-drag. However, the various preceding types of touch can be applied analogously to the predetermined type of touch in the same or a similar manner.
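For single-pointer gestures, the touch vocabulary above reduces to comparisons of contact duration and travel distance; pinch-in and pinch-out would additionally compare the distance between two contacts over time. A minimal, framework-independent Kotlin sketch follows, with thresholds that are assumptions rather than values from the specification.

```kotlin
// Illustrative single-pointer classifier for the touch types defined above.
// TouchSample and all thresholds are invented for this sketch.

import kotlin.math.hypot

data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

enum class TouchType { SHORT_TOUCH, LONG_TOUCH, DRAG, FLICK }

const val LONG_TOUCH_MS = 500L   // assumed "predetermined period"
const val MOVE_SLOP_PX = 20f     // movement below this counts as a stationary touch
const val FLICK_MAX_MS = 300L    // a drag completed within this period is a flick

fun classify(down: TouchSample, up: TouchSample): TouchType {
    val duration = up.timeMs - down.timeMs
    val distance = hypot(up.x - down.x, up.y - down.y)
    return when {
        distance < MOVE_SLOP_PX && duration < LONG_TOUCH_MS -> TouchType.SHORT_TOUCH
        distance < MOVE_SLOP_PX -> TouchType.LONG_TOUCH
        duration < FLICK_MAX_MS -> TouchType.FLICK
        else -> TouchType.DRAG
    }
}

fun main() {
    val down = TouchSample(100f, 100f, timeMs = 0)
    println(classify(down, TouchSample(105f, 102f, timeMs = 120)))  // SHORT_TOUCH
    println(classify(down, TouchSample(300f, 100f, timeMs = 150)))  // FLICK
}
```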
According to the present disclosure, the control screen 250 may be a screen for controlling external devices arranged at the predetermined location. More specifically, the control screen 250 may include a map image for at least a partial region of the predetermined location, to intuitively show the user the arrangement relationship, and the like, between the external devices within the predetermined location. Referring to the fourth drawing of Fig. 2B, the control screen 250 may include image objects 251, 252 corresponding to the external devices, displayed so as to overlap the map image. Status information or notification information on the external devices corresponding to the image objects 251, 252 may be displayed so as to overlap the image objects 251, 252. In addition, when a touch is applied to the image objects 251, 252, a control over the external devices corresponding to the image objects can be performed. This will be described below in detail.

In addition, the map image may be an image for various regions (a first region, a second region, ..., an n-th region) of the predetermined location. In other words, the predetermined location may be the same, but the map images for each region may be different. For example, when a map image of the predetermined location includes a map image for a first region and a second region, the first region and the second region may be different regions. Here, the first region and the second region may be regions that do not overlap each other at the predetermined location. Alternatively, the first region and the second region may be regions one of which contains the other. More specifically, when a first predetermined type of touch is applied in a state where a map image for the first region is displayed on the touch screen 151, a map image for the second region can be displayed. In addition, when a second predetermined type of touch is applied in a state where a map image for the second region is displayed thereon, a map image for the first region may be displayed. In other words, the map images for the first region and the second region can be converted into each other, and displayed on the touch screen 151. This will be described below in detail.

On the other hand, the map image may include various types of map images. For example, the map image may be any of a three-dimensional map image, a plan view image, and a stereoscopic view image corresponding to a partial region of the predetermined location. Similarly to the preceding description, when a first predetermined type of touch is applied in a state where one type (called a first type) of map image is displayed on the touch screen 151, another type (called a second type) of map image can be displayed. In addition, when a second predetermined type of touch is applied in a state where the second type of map image is displayed thereon, the first type of map image may be displayed again. This will be described below in detail with reference to the accompanying drawings.
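The structure described so far, namely a predetermined place divided into regions, each region carrying several map image types with device image objects anchored on them, can be summarized in a small data model. The following Kotlin sketch is a hypothetical illustration; none of the type names come from the specification.

```kotlin
// Illustrative data model: place -> regions -> map images and device objects.

enum class MapType { THREE_DIMENSIONAL, PLAN_VIEW, STEREOSCOPIC }

data class ImageObject(val deviceId: String, val x: Float, val y: Float)

data class Region(
    val name: String,                 // e.g. "living room"
    val maps: Map<MapType, String>,   // map type -> stored image resource
    val objects: List<ImageObject>    // devices arranged in this region
)

data class PredeterminedPlace(val name: String, val regions: List<Region>)

fun main() {
    val livingRoom = Region(
        name = "living room",
        maps = mapOf(
            MapType.PLAN_VIEW to "plan_living.png",
            MapType.THREE_DIMENSIONAL to "3d_living.obj"
        ),
        objects = listOf(ImageObject("tv-1", 0.4f, 0.2f), ImageObject("lamp-1", 0.8f, 0.7f))
    )
    val home = PredeterminedPlace("user's home", listOf(livingRoom))
    println(home.regions.first().objects.map { it.deviceId })  // [tv-1, lamp-1]
}
```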
Hereinafter, the three-dimensional map image among the preceding types of map images will be described first, and then the plan view image will be described. In addition, a control method for mutual conversion between three-dimensional map images and plan view images will be described. Figs. 3A and 3B are conceptual views for explaining a three-dimensional map image included in a control screen associated with the present disclosure. Referring to the first drawing of Figure 3A, the control screen may include a three-dimensional map image 310 for at least a partial region of the predetermined location. The three-dimensional map image may be at least one of a three-dimensional map image pre-stored in the memory and an image received in real time from a view capturing apparatus provided in the body. In the present specification, the three-dimensional map image will be described as a three-dimensional map image pre-stored by the user. In addition, in this specification, the three-dimensional map image will be described on the assumption that it is a three-dimensional map image for a first place (the "user's home") as the predetermined location.

On the other hand, according to the present disclosure, the three-dimensional map image may be displayed so as to correspond to the detected current location and direction of the body within the predetermined location. Specifically, as described above, the first place may be partitioned into a plurality of regions. For example, the first place can be partitioned into a region 1-1, a region 1-2, a region 1-3, ..., a region 1-n. In this case, the three-dimensional map image may include a plurality of three-dimensional map images corresponding to the plurality of regions. In other words, among the three-dimensional map images, a three-dimensional map image corresponding to the region in which the current location of the body is detected can be displayed on the touch screen. Therefore, a three-dimensional map image corresponding to the region in which the user is currently located can be displayed on the control screen. Moreover, the controller 180 may display the three-dimensional map image so as to correspond to the detected direction in which the body is directed or the detected inclination of the body. Here, the direction in which the body is directed can be detected via a view capturing apparatus provided in the body. For example, in this case, a portion of the three-dimensional map image corresponding to an image received from the view capturing apparatus can be extracted. In addition, the inclination of the body can be detected by at least one of a geomagnetic sensor, a gyro sensor and an acceleration sensor.

Referring to Figures 3A and 3B, when the direction in which the body is directed changes, the three-dimensional map images 310, 320 displayed on the touch screen can be changed to match the changed direction. In summary, a portion of the three-dimensional map image corresponding to the current location and direction of the body 100 is displayed on the touch screen. For example, with reference to Figure 3A, a three-dimensional map image 310 for a space 310' corresponding to the direction in which the mobile terminal 100 is directed is displayed on the touch screen 151. Therefore, the three-dimensional map image 310 may include image objects 311, 312 corresponding to external devices 311', 312' included in the space 310' corresponding to the direction in which the body 100 is directed. Referring to Fig. 3B, when the direction in which the body is directed changes, a three-dimensional map image 320 for a space 320' corresponding to the changed direction of the body 100 is displayed on the touch screen 151. In other words, the three-dimensional map image 320 may include image objects 321, 322 corresponding to external devices 321', 322' included in the space 320' corresponding to the direction in which the body 100 is currently directed. Therefore, a portion of the three-dimensional map image corresponding to the space directly viewed by the user can be displayed on the touch screen 151. In other words, the user can intuitively relate the three-dimensional map image displayed on the touch screen 151 to his or her real space.
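One plausible reading of the direction-dependent display in Figs. 3A and 3B is a field-of-view test: only devices whose anchors fall within an angular window centred on the body's azimuth are shown. The sketch below assumes the azimuth has already been obtained from the geomagnetic, gyro and acceleration sensors; the 60-degree field of view and all names are illustrative.

```kotlin
// Hedged sketch of selecting the visible portion of the three-dimensional map
// from the body's direction. The azimuth is taken as given.

data class DeviceAnchor(val id: String, val azimuthDeg: Float)  // bearing of device from viewer

const val FIELD_OF_VIEW_DEG = 60f

// Smallest angular difference between two azimuths, in degrees.
fun angularDelta(a: Float, b: Float): Float {
    val d = ((a - b) % 360f + 360f) % 360f
    return if (d > 180f) 360f - d else d
}

// Devices whose anchors fall inside the field of view centred on the body direction.
fun visibleDevices(bodyAzimuthDeg: Float, anchors: List<DeviceAnchor>): List<DeviceAnchor> =
    anchors.filter { angularDelta(it.azimuthDeg, bodyAzimuthDeg) <= FIELD_OF_VIEW_DEG / 2 }

fun main() {
    val anchors = listOf(DeviceAnchor("tv-1", 10f), DeviceAnchor("aircon-1", 200f))
    println(visibleDevices(bodyAzimuthDeg = 355f, anchors).map { it.id })  // [tv-1]
}
```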
Alternatively, although not shown in the drawing, unlike the example described, the three-dimensional map image displayed on the touch screen can also keep the initially displayed image as it is, even if the direction in which the body is directed changes. In this case, when a predetermined touch (for example, a flick in a horizontal direction) is applied to the three-dimensional map image, a three-dimensional map image shifted in a direction corresponding to the direction of the applied touch can be displayed. On the other hand, a case where the predetermined location is a first place (the "user's home") is illustrated in the drawing as an example, but there may be various three-dimensional map images depending on the location at which the body is located.

Moreover, according to the present disclosure, when a predetermined type of touch is applied to the touch screen 151 in a state where the three-dimensional map image is displayed thereon, a plan view image for a partial region of the predetermined location including the current location of the body may be displayed. Here, the plan view image may be an image corresponding to a view in which at least a partial region of the predetermined location is seen from above. More specifically, the plan view image may be a horizontal projection view in which at least a partial region of the predetermined location is cut on a horizontal surface at a predetermined height. For example, when the current location of the body is in a first region (e.g., the living room) of a predetermined location (e.g., the user's home), the plan view image may be an image showing the first region seen from above. On the other hand, although not shown in the drawing, the plan view image may be displayed so as to correspond to the direction in which the body is directed when the plan view image is displayed. For example, the plan view image can be displayed in such a way that, among the upper side, the lower side, the left side and the right side of the plan view image, external devices corresponding to the direction in which the body is directed are arranged on the upper side. Further, as described above, an image object corresponding to an external device may be displayed so as to overlap the plan view image. In other words, the image object may be located so as to correspond to the location in the first region at which the external device is disposed on the plan view image. Therefore, the user can control the external device intuitively and quickly by using the plan view image. This will be described below in more detail.

On the other hand, as described above, the plan view image may be displayed in response to a predetermined type of touch applied in a state where the three-dimensional map image is displayed. Hereinafter, a mutual conversion control method between the images will be described in detail with reference to the accompanying drawings. Figs. 4A to 4I are conceptual views for explaining a control method for mutual conversion between various map images included in a control screen associated with the present disclosure. Referring to the first and second drawings of Fig. 4A, when a predetermined type of touch 410 is applied to the three-dimensional map image 411 in a state where the three-dimensional map image 411 is displayed on the touch screen 151, a plan view image 412 for a partial region including the current location of the body is displayed. Here, the predetermined type of touch 410 may be the same type of touch as the predetermined type of touch related to the display function of the control screen, as described above.
In other words, when the predetermined type of touch related to the display function of the control screen is a pinch-out or pinch-in touch, the predetermined type of touch related to the display function of the plan view image may also be a pinch-out or pinch-in touch. In the drawing, a case where the predetermined type of touch is a pinch-out touch is illustrated. Referring to the second drawing of Figure 4A, a plan view image 412 for a partial region of the predetermined location is displayed on the touch screen 151. As described above, the partial region may be a first region (for example, the living room) including the current location of the body. Subsequently, with reference to the second and third drawings of Fig. 4A, when the predetermined type of touch (a pinch-out touch, as described above) 410 is applied in a state where the plan view image 412 for the first region is displayed, a plan view image 413 for a second region including the first region may be displayed. Here, the second region may denote a region larger than, and including, the first region. For example, referring to the third drawing of Figure 4A, the second region may be the predetermined location (e.g., the user's home) itself. Subsequently, when a predetermined type of touch (a pinch-out touch, as described above) 410 is applied to the plan view image 413 in a state where the plan view image 413 for the predetermined location is displayed, an image 414 for the predetermined location and the surrounding locations of the predetermined location may be displayed. The surrounding locations can be preset by the user. For example, where the predetermined location is the user's home, the surrounding locations may be a parking lot, a playground and the like used by the user.

In other words, with reference to FIG. 4A, a map image for at least a partial region of a predetermined location is displayed on a control screen for performing a control associated with the predetermined location. When a predetermined type of touch is applied to the map image, a map image different from the currently displayed map image is displayed. That is, with reference to Fig. 4A, a plan view image 412 for a first region is displayed when a pinch-out touch 410 is applied in a state where a three-dimensional map image 411 is displayed, and a plan view image 413 for a second region (here, the entire predetermined location) including the first region is displayed when the pinch-out touch 410 is subsequently applied once more. Subsequently, when the pinch-out touch 410 is applied once again, a plan view image 414 for the predetermined location and the surrounding locations may be displayed. In other words, a map image for a larger area can be displayed each time the pinch-out touch 410 is applied to the touch screen 151. In addition, with reference to Fig. 4B, when a pinch-in touch 420 is applied to the plan view image 414 in a state where the plan view image 414 for the predetermined location and its surrounding locations is displayed, the plan view image 413 for the second region of the predetermined location is displayed. Subsequently, when the pinch-in touch 420 is applied once more, the plan view image 412 for the first region included in the second region is displayed. Subsequently, when the pinch-in touch 420 is applied once more, the three-dimensional map image 411 for the first region can be displayed. In other words, a map image for a smaller area can be displayed each time the pinch-in touch 420 is applied to the touch screen 151.
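The pinch-driven navigation of Figs. 4A and 4B behaves like a small state machine over map levels, widening on pinch-out and narrowing on pinch-in. A minimal Kotlin sketch follows, with an assumed four-level hierarchy mirroring the user's-home example; the enum itself is an assumption.

```kotlin
// Illustrative map-level navigation: pinch-out widens the displayed area by one
// level, pinch-in narrows it, clamped at both ends of the hierarchy.

enum class MapLevel { THREE_D_ROOM, PLAN_ROOM, PLAN_PLACE, PLAN_SURROUNDINGS }

fun onPinchOut(level: MapLevel): MapLevel =
    MapLevel.values().getOrElse(level.ordinal + 1) { level }  // stop at widest level

fun onPinchIn(level: MapLevel): MapLevel =
    MapLevel.values().getOrElse(level.ordinal - 1) { level }  // stop at narrowest level

fun main() {
    var level = MapLevel.THREE_D_ROOM
    repeat(3) { level = onPinchOut(level) }
    println(level)             // PLAN_SURROUNDINGS
    println(onPinchIn(level))  // PLAN_PLACE
}
```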
So far, a case where the body is located at a first place (for example, the user's home) among the predetermined locations has been described. Even when the body is located at a second or third place different from the first place among the predetermined locations, a map image for a variety of regions may be displayed in association with the predetermined type of touch, as for the first place. For example, when a predetermined type of touch is applied to a predetermined screen while the body is located at a second place (e.g., the user's office), a control screen capable of performing a control over the second place is displayed. In this case, with reference to FIG. 4C, the control screen may be a three-dimensional map image 421 for at least a partial region of the second place. As described above, when a pinch-out touch 410 is applied to the three-dimensional map image 421, a plan view image 422 for a partial region of the second place is displayed. When a pinch-out touch 410 is applied to the plan view image 422, a plan view image 423 for a larger region of the second place including the partial region is displayed. Subsequently, when the pinch-out touch 410 is applied to the plan view image 423, a plan view image 424 for the second place and the surrounding locations (e.g., a parking lot) of the second place can be displayed. Further, with reference to Fig. 4D, even when the body is located at the third place (e.g., the user's car), a map image of the third place covering a gradually larger area may be displayed in association with the pinch-out touch 410. For example, the map image initially displayed on entering the control screen may be a three-dimensional map image 431 for the driver's seat. When a pinch-out touch 410 is applied to the image 431, a plan view image 432 for the driver's seat may be displayed. Subsequently, a plan view image 433 for all the seats inside the user's car and a plan view image 434 for the user's entire car can be sequentially displayed on the touch screen 151. As shown in FIGS. 4A to 4D, a user can more effectively perform a control over a predetermined location by using map images for a variety of regions of the predetermined location.

On the other hand, with reference to FIG. 4E, when a predetermined type of touch (for example, a flick) 430 is applied to the touch screen 151 in a state where a plan view image 441 for a partial region of a predetermined location is displayed thereon, a stereoscopic view image 442 for the same region can be displayed. That is, when a predetermined type of touch 430 is applied to the touch screen 151 in a state where a first type of image 441 corresponding to a partial region of a predetermined location is displayed on the touch screen 151, a second type of image 442 corresponding to the partial region can be displayed. Further, when a predetermined type of touch 430' is applied to the touch screen 151 in a state where the second type of image 442 corresponding to the partial region is displayed on the touch screen 151, the first type of image 441 corresponding to the partial region may be displayed.

On the other hand, in a mobile terminal according to the present disclosure, different types of map images can be displayed on the touch screen 151 depending on the posture of the mobile terminal 100. More specifically, the detection unit 140 can detect the angle between the body 100 and the ground. Referring to FIG. 4F, when the angle between the body 100 and the ground is greater than a predetermined value, a first type of image (for example, a stereoscopic view image) 451 for a partial region of a predetermined place can be displayed. In addition, when the angle between the body 100 and the ground is less than the predetermined value, a second type of image (for example, a plan view image) 452 for the same region can be displayed. As a result, a user can more effectively perform control over a predetermined location using the various types of map images of the predetermined location.
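The posture-dependent selection of Fig. 4F amounts to a threshold on the body-to-ground angle. A hedged Kotlin sketch follows; the 45-degree value is an assumption, the specification saying only "a predetermined value".

```kotlin
// Illustrative posture rule: map type follows the angle between body and ground,
// as reported by the terminal's orientation sensors.

enum class ViewStyle { STEREOSCOPIC, PLAN_VIEW }

const val POSTURE_THRESHOLD_DEG = 45f  // assumed "predetermined value"

fun mapTypeFor(bodyGroundAngleDeg: Float): ViewStyle =
    if (bodyGroundAngleDeg > POSTURE_THRESHOLD_DEG) ViewStyle.STEREOSCOPIC
    else ViewStyle.PLAN_VIEW

fun main() {
    println(mapTypeFor(70f))  // STEREOSCOPIC: terminal held upright
    println(mapTypeFor(10f))  // PLAN_VIEW: terminal lying nearly flat
}
```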
Further, according to the present disclosure, when a predetermined type of touch is applied to the touch screen in a state where an image corresponding to a partial region of a predetermined location is displayed on the touch screen 151, an image corresponding to a different region of the predetermined location can be displayed. Referring to FIG. 4G, when a predetermined type of touch (for example, a flick in one direction of the touch screen) is applied in a state where a plan view image 461 for a first region (e.g., the bedroom) of a predetermined location is displayed on the touch screen 151, a plan view image 462 for a second region (e.g., the workplace) of the predetermined location is displayed. Similarly, when the predetermined type of touch is applied in a state where the plan view image 462 for the second region (e.g., the workplace) is displayed, a plan view image 463 for a third region (e.g., the toilet) is displayed. Similarly, when the predetermined type of touch is applied in a state where the plan view image 463 for the third region (e.g., the toilet) is displayed, a plan view image 464 for a fourth region (for example, the small room) is displayed. In addition, with reference to FIG. 4G, an indicator 465 indicating which region the displayed image corresponds to may be displayed on the touch screen 151.

Referring to the first drawing of Fig. 4H, a plan view image 481 for a first region (e.g., the living room) of a predetermined location may be displayed on the touch screen 151 together with an image object 482 displaying an entire map of the predetermined location. Referring to the second drawing of Figure 4H, when a predetermined type of touch is applied to the image object 482, the image object 482 may be magnified. In addition, the image object 482 may include an object 482a highlighting the region of the predetermined location in which the body 100 is currently located. On the other hand, with reference to the second and third drawings of FIG. 4H, when a predetermined type of touch is applied to the plan view image 481, an object 482b highlighting a region different from said region may be displayed on the image object 482 depending on the direction of the applied touch. On the other hand, referring to the third and fourth drawings of Fig. 4H, guide information 483 on the touch direction can be displayed in the vicinity of the object 482b highlighting the different region on the image object 482. More specifically, the guide information 483 on the touch direction may include direction information for at least one of the upward, downward, left and right directions.
When a touch-and-drag corresponding to at least one of the upward, downward, left and right directions is applied according to the provided guide information 483, an object 482c highlighting another region on the image object 482 can be displayed depending on the direction of the applied touch-and-drag. Although not shown in the drawing, when a predetermined type of touch is applied to the image object 482, the image object 482 may no longer be displayed. In addition, a plan view image corresponding to the other region may be displayed. As described heretofore, the control screen can be converted between the various regions of the predetermined location via the image object 482 displaying the map of the predetermined location.

Referring to the first and second drawings of Fig. 4I, when a predetermined type of touch (e.g., a long touch) is applied to an image object 472 included in an image 471 in a state where the image 471 for any region of a predetermined location is displayed on the touch screen 151, a graphic object 473 corresponding to a home screen may be displayed. Referring to the second and third drawings of Fig. 4I, when there is a plurality of home screens, a plurality of graphic objects corresponding to the home screens, respectively, may be displayed. When any one of the graphic objects 473a is selected, an icon 474 linked to a control screen of the external device corresponding to the image object 472 may be added to the home screen corresponding to the selected graphic object. On the other hand, with reference to Fig. 4I, an indicator 475 indicating that the currently displayed image is an image for a certain region can be displayed in a region of the touch screen 151. So far, a case where map images of various ranges and types are converted into one another and displayed on the display unit has been described.

On the other hand, according to the present disclosure, the control screen may include an image object overlapping the map image and corresponding to an external device. It may be possible to display the status information and notification information of an external device, or to perform a control of the external device, through the image object. Hereinafter, the display of the status information and the notification information of an external device through the image object will be described with reference to FIGS. 5A and 5B, and then the performance of a control of an external device via the image object will be described with reference to Figure 5C. Figs. 5A and 5B are conceptual views for explaining a control method for displaying status information or notification information of an external device using an image object associated with the present disclosure. In addition, Figures 5C and 5D are conceptual views for explaining a method of controlling an external device using an image object. Referring to the first drawing of Fig. 5A, a map image 510 for a partial region of a predetermined location, displayed on the touch screen 151, may include image objects 511, 512, 513 corresponding to external devices disposed in the partial region. Specifically, when there is a plurality of external devices, a plurality of image objects 511, 512, 513 corresponding to the external devices, respectively, are displayed on the map image. Each image object can have color information or edge information.
Information about the state of the external devices can be displayed on the touch screen 151 via the color information or edge information of the image objects. In one embodiment, when the image objects 511, 513 have first edge information 514, this may be a case where the electrical power of the external devices corresponding to the image objects is turned off. Further, when the image object 512 has second edge information 515, this may be a case where the electrical power of the external device corresponding to the image object is turned on. In one embodiment, when an image object has first color information, the external device corresponding to the image object may be in a state paired with the body. In addition, when an image object has second color information, the external device corresponding to the image object may be in a state not paired with the body.

Furthermore, with reference to the first and second drawings of Fig. 5A, notification icons 511', 512', 513' of the external devices corresponding to the image objects 511, 512, 513 can be displayed at locations overlapping, or adjacent to, the image objects. When a touch is applied to the notification icons 511', 512', 513' or the image objects 511, 512, 513, the notification information of the external devices can be displayed. Specifically, the details of the notification information may vary for each external device. For example, as shown in the first and second drawings of Figure 5A, when a touch is applied to the image object 512 or the notification icon 512' corresponding to a "phone", a description of calls that have been received by the "phone" but not answered by the user, unchecked messages, or the like may be displayed. As another example of notification information, as illustrated in the first and second drawings of Fig. 5B, when a touch is applied to an image object 516 or a notification icon 516' corresponding to a "refrigerator", notification information about the "refrigerator" can be displayed. For example, with reference to the second drawing of Figure 5B, the controller 180 can compare the expiration date of food stored in the "refrigerator" with the current date, to display information about the time remaining until the expiration date. Here, the expiration date may be pre-stored in the memory, or the like, by the user when the food is stored in the refrigerator. So far, display modes and the like of the status information and the notification information of external devices using the image objects have been described.

On the other hand, as described above, according to the present disclosure, it may be possible to control the external devices corresponding to the image objects according to a touch applied to the image objects. In other words, when a predetermined type of touch is applied to an image object, the controller 180 can generate a control command for the external device corresponding to the image object. The control command can be transmitted directly to the external device via the wireless communication unit 110 of the body. Alternatively, the control command may be transmitted to the external device through a predetermined server provided at the predetermined location.
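The two transmission paths just described (directly via the wireless communication unit 110, or through a predetermined server) suggest a simple fallback arrangement. The following Kotlin sketch is an assumption-laden illustration, not an API from the specification: a command is sent directly when the device is reachable, and relayed through the place's server otherwise.

```kotlin
// Illustrative command routing: prefer the direct wireless path, fall back to
// the place's server. All interfaces and names are invented for this sketch.

data class ControlCommand(val deviceId: String, val action: String)

interface CommandChannel {
    fun canReach(deviceId: String): Boolean
    fun send(command: ControlCommand)
}

class DirectWirelessChannel : CommandChannel {
    private val reachable = setOf("light-1")  // stand-in for devices paired directly
    override fun canReach(deviceId: String) = deviceId in reachable
    override fun send(command: ControlCommand) = println("direct: $command")
}

class PlaceServerChannel : CommandChannel {
    override fun canReach(deviceId: String) = true  // server relays to any device
    override fun send(command: ControlCommand) = println("via server: $command")
}

fun dispatch(command: ControlCommand, channels: List<CommandChannel>) {
    channels.firstOrNull { it.canReach(command.deviceId) }?.send(command)
}

fun main() {
    val channels = listOf(DirectWirelessChannel(), PlaceServerChannel())
    dispatch(ControlCommand("light-1", "power-on"), channels)   // direct
    dispatch(ControlCommand("boiler-1", "power-on"), channels)  // via server
}
```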
For example, with reference to FIG. 5C, when a predetermined type of touch is applied to an image object 521 corresponding to an external device (e.g., a "lighting apparatus"), a touch control can be performed on the external device. Specifically, the electrical power of the "lighting apparatus" may be turned on according to a touch applied to the image object 521 in a state where the electrical power of the "lighting apparatus" is turned off. In another embodiment, when a touch-and-drag is applied to the image object 521 corresponding to the "lighting apparatus", the brightness of the "lighting apparatus" may vary depending on the length of the drag trajectory. In addition, a brightness display bar 522 associated with the brightness of the "lighting apparatus" may be displayed at a location adjacent to the image object 521. The brightness display bar 522, 522' of the "lighting apparatus" may be displayed differently depending on the changed brightness of the "lighting apparatus". On the other hand, although not shown in the drawing, when a touch-and-drag is applied to an image object corresponding to a "fan", the air flow of the "fan" can be changed according to the length of the drag trajectory.

Furthermore, with reference to FIG. 5D, when a predetermined touch (for example, a pinch-out touch) is applied to an image object 531 corresponding to a "TV" 530 among the external devices, screen information 532 currently being displayed on the "TV" 530 can be transmitted to the body 100. As described above, the screen information 532 may be transmitted to the body 100 via a wireless communication unit further provided in the "TV" 530, or transmitted to the body 100 through a predetermined server. Through this, a video being displayed on a TV set can be displayed on the touch screen 151 of the mobile terminal without a complicated process. As a result, the user can continue to view a video, which has been viewed via the TV set, via the terminal body, even when leaving his or her home. So far, a method of allowing the controller to control an external device corresponding to an image object according to a predetermined type of touch applied to the image object included in a map image displayed on the touch screen 151 has been described.

On the other hand, according to the present disclosure, when a touch is applied to an image object, a window for controlling the external device corresponding to the image object may be displayed, and the external device may then be controlled according to a user input entered through the window. This will be described below in more detail with reference to the accompanying drawings. Figs. 6A to 6E are conceptual views for explaining a method of controlling an external device based on user input entered through a window associated with the present disclosure. Referring to Fig. 6A, on the touch screen 151, a plan view image 611 for a partial region of a predetermined location includes image objects for the external devices disposed in the partial region. For example, when a touch is applied to an image object 61 corresponding to an "air conditioner" among the image objects, a first window 612 including a temperature control bar 612a and a second window 613 including weather information can be displayed. The temperature and the air flow output from the air conditioner can be controlled in response to a predetermined type of touch applied to the temperature control bar 612a. For example, the temperature and the intensity of the air flow can be controlled according to the position on the temperature control bar 612a corresponding to the position at which the touch is released.
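The control-bar behaviour just described, where the setpoint follows the release position, is essentially a linear mapping from bar position to value. A small Kotlin sketch under assumed values (an 18 to 30 degree range, a 600 px bar):

```kotlin
// Illustrative control bar: convert a release position along the bar into a
// setpoint. Range and length are invented example values.

data class ControlBar(val minValue: Float, val maxValue: Float, val lengthPx: Float)

fun setpointAt(bar: ControlBar, releasePx: Float): Float {
    val fraction = (releasePx / bar.lengthPx).coerceIn(0f, 1f)  // clamp to the bar
    return bar.minValue + fraction * (bar.maxValue - bar.minValue)
}

fun main() {
    val temperatureBar = ControlBar(minValue = 18f, maxValue = 30f, lengthPx = 600f)
    println(setpointAt(temperatureBar, releasePx = 300f))  // 24.0 degrees
    println(setpointAt(temperatureBar, releasePx = 900f))  // clamped to 30.0
}
```

The same mapping would serve the brightness and air-flow drags described above, with drag length in place of release position.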
Although not shown in the drawing, when a predetermined type of touch is applied to an image object corresponding to a "boiler", similarly to the previous example, a window including a temperature control bar may be displayed. The temperature of the boiler may be controlled in response to a predetermined type of touch applied to the temperature control bar. Referring to Fig. 6B, similarly to the preceding example, when a predetermined type of touch (e.g., a short touch) is applied to an image object 621 corresponding to a "refrigerator", a window 622 including a temperature control bar 622a for the refrigerating compartment and/or the freezing compartment can be displayed. The temperature of the refrigerating compartment and/or the freezing compartment can be controlled in response to a predetermined type of touch applied to the temperature control bar 622a. Further, when a predetermined type of touch (e.g., a pinch-out touch, as shown in the drawing) is applied to the window 622, an image 623 associated with the interior of the "refrigerator" may be displayed. The image 623 may be an image received in real time from a view capturing apparatus installed in the "refrigerator".

Referring to Fig. 6C, similarly to the previous example, when a predetermined type of touch is applied to an image object 631 corresponding to a "TV set" or a "set-top box", a window 632 enabling a user to change the TV channel is displayed. Referring to the second and third drawings of Fig. 6C, when a predetermined type of touch (e.g., a pinch-out touch) is applied to the window 632, an image of the currently selected channel may be displayed on the touch screen 151 of the terminal. Alternatively, referring to the third and fourth drawings of Fig. 6C, an icon 633 corresponding to the mobile terminal may be displayed together with the window 632 on the touch screen 151. In this case, a touch-and-drag may be applied with a position on an image object 632a corresponding to one of the plurality of channels as a starting point, and a position on the icon 633 corresponding to the mobile terminal as a release point. The image of the channel corresponding to the image object 632a can be displayed on the touch screen 151 by this touch-and-drag. On the other hand, an icon 634 corresponding to the TV set can be displayed together with the window 632 on the touch screen 151. In this case, a touch-and-drag may be applied with a position on the image object 632a corresponding to one of the plurality of channels as a starting point, and a position on the icon 634 corresponding to the TV set as a release point. The image of the channel corresponding to the image object 632a can likewise be displayed on the TV set by this touch-and-drag.

Referring to Fig. 6D, when a predetermined touch is applied to an image object 641 corresponding to a "first phone", a window 642 for controlling the "first phone" is displayed on the touch screen 151. On the window, icons corresponding to another "second phone" or a "mobile terminal" paired with the "first phone" can be displayed. When one of the icons is selected, a call received at the first phone may be connected to the "mobile terminal" corresponding to the selected icon 642a. Alternatively, when the icon 642a is selected and a touch is then applied to a virtual button 643 for automatic call forwarding, a control command associated with connecting a call received at the first phone to the mobile terminal corresponding to the selected icon 642a can be generated.
Subsequently, with reference to Fig. 6E, when a predetermined type of touch (e.g., a pinch-out touch) is applied to the window 642 for controlling the "first phone", information 644 about the call history of the "first phone" can be displayed. Heretofore, a method of allowing the controller 180 to control external devices individually has been described. On the other hand, the external devices can also be controlled collectively. Hereinafter, this will be described in more detail with reference to the accompanying drawings.

Figs. 7A and 7B are conceptual views for explaining a window for controlling a plurality of external devices associated with the present disclosure. Referring to FIG. 7A, when a predetermined type of touch (for example, a flick in a downward direction) is applied in a state where a map image 711 for a partial region of a predetermined location is displayed on the touch screen 151, a window 712 for control associated with the temperature, humidity, lighting, and the like of the partial region can be displayed. More specifically, the window 712 may include at least one of a temperature control bar 713, a humidity control bar 714 and a lighting control bar 715. For example, when a predetermined type of touch is applied to the temperature control bar 713, the temperature of the partial region can be controlled through an external device, such as an air conditioner, a fan or the like, disposed in the partial region of the predetermined location. In addition, with reference to Fig. 7B, when a predetermined type of touch (for example, a flick in an upward direction) is applied in a state where the map image 711 for a partial region of a predetermined location is displayed on the touch screen 151, a list of the external devices disposed in the region and the status information 716 of the external devices can be displayed. On the other hand, although not shown in the drawing, when a two-finger flick is applied, a list of the external devices arranged over the entire predetermined location and the status information of the external devices can be displayed.
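The collective window 712 of Fig. 7A can be pictured as one set of region-wide setpoints fanned out to whichever devices in the region handle each quantity. A hypothetical Kotlin sketch follows; the capability model is invented for illustration.

```kotlin
// Illustrative collective control: region-wide setpoints dispatched to the
// devices in the region that handle each quantity.

enum class Quantity { TEMPERATURE, HUMIDITY, LIGHTING }

data class RegionDevice(val id: String, val handles: Set<Quantity>)

data class RegionSetpoints(val values: Map<Quantity, Float>)

fun applySetpoints(devices: List<RegionDevice>, setpoints: RegionSetpoints) {
    for ((quantity, value) in setpoints.values) {
        devices.filter { quantity in it.handles }
            .forEach { println("send $quantity=$value to ${it.id}") }
    }
}

fun main() {
    val livingRoom = listOf(
        RegionDevice("aircon-1", setOf(Quantity.TEMPERATURE, Quantity.HUMIDITY)),
        RegionDevice("lamp-1", setOf(Quantity.LIGHTING))
    )
    applySetpoints(
        livingRoom,
        RegionSetpoints(mapOf(Quantity.TEMPERATURE to 23f, Quantity.LIGHTING to 0.6f))
    )
}
```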
On the other hand, when a request from the user to end the control over a predetermined location is received, the controller 180 may control the touch screen 151 so as not to display the control screen. In other words, when a request from the user to end the control over a predetermined location is received, the controller 180 according to the present disclosure can control the touch screen 151 to display a home screen. Figs. 8A and 8B are conceptual views for explaining a control method associated with ending the control screen of a predetermined location.

Referring to FIG. 8A, when a predetermined type of touch is applied in a state where the control screen 811 of a predetermined location is displayed on the touch screen 151, the screen can be switched from the control screen 811 to a home screen 812. In other words, as illustrated in the drawing, when a touch-and-drag is applied with one edge of the touch screen 151 as the starting point and another edge thereof as the end point, the control screen 811 can be ended. Here, depending on the direction of the applied touch-and-drag, a representative icon 812a linked to the control screen may be displayed, or icons 813, 814, 815, 816, 817 corresponding to each region of the predetermined location may be displayed, on the home screen 812. For example, referring to the third and fourth drawings of Fig. 8A, when a touch-and-drag is applied in one direction in a state where the control screen 811 is displayed, a representative icon 812a linked to the control screen can be displayed in a region adjacent to the position at which the touch-and-drag is completed on the home screen 812. Furthermore, with reference to the first and second drawings of Fig. 8A, when a touch-and-drag is applied in the direction opposite to said one direction in a state where the control screen 811 is displayed, icons 813, 814, 815, 816, 817 corresponding to each region can be displayed in a region adjacent to the position at which the touch-and-drag is completed on the home screen 812. For example, the icons 813, 814, 815, 816, 817 corresponding to each region of the predetermined location may be icons corresponding to a "bedroom", a "living room", a "kitchen" and a "toilet" of the "user's home". In addition, the icon 814 corresponding to the region in which the body is currently located among the regions can be further displayed. On the other hand, according to the present embodiment, it is illustrated that the representative icon 812a is displayed when the touch-and-drag is applied in the left direction, and the plurality of icons 813, 814, 815, 816, 817 corresponding to each region are displayed when the touch-and-drag is applied in the right direction, in a state where the control screen is displayed, but the present disclosure may not necessarily be limited to this. According to another embodiment, either the representative icon 812a or the icons 813, 814, 815, 816, 817 corresponding to each region can be controlled to be displayed as a function of the length of the trajectory along which the touch-and-drag is applied. In addition, according to yet another embodiment, it may be possible to display a plurality of icons corresponding to each external device instead of the plurality of icons 813, 814, 815, 816, 817 corresponding to each region.

When an event occurs on an external device in a state where the home screen 812 is displayed, a notification icon signaling the event can be displayed at a position overlapping, or adjacent to, the icon 814 corresponding to the region in which the external device is disposed. Referring to the first and second drawings of Fig. 8B, when a predetermined type of touch (e.g., a touch-and-drag) is applied to the icon 814 corresponding to the region, an icon 821 corresponding to an external device on which an event has occurred can be displayed. Further, when there is a plurality of external devices on which events have occurred, as the length of the touch-and-drag trajectory increases, icons 821, 822 corresponding, respectively, to the plurality of external devices on which the events have occurred can be displayed. Referring to the third drawing of Figure 8B, the plurality of icons 821, 822 may be displayed in an order corresponding to the external devices most recently used by the user. Alternatively, the plurality of icons 821, 822 may be displayed in an order corresponding to the external devices most frequently used by the user. Referring to the fourth drawing of Fig. 8B, an icon 823 corresponding to an external device on which no event has occurred can also be displayed together with the icons 821, 822 corresponding to the external devices on which events have occurred.
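The icon ordering of Fig. 8B (more icons revealed as the drag lengthens, ordered by recency or by frequency of use) might be sketched as follows; the per-icon drag distance and the usage-record fields are assumptions.

```kotlin
// Illustrative icon reveal: the drag length determines how many device icons
// are shown, and the ordering follows recency or frequency of use.

data class DeviceUsage(val id: String, val lastUsedMs: Long, val useCount: Int)

enum class Ordering { RECENT, FREQUENT }

fun revealedIcons(devices: List<DeviceUsage>, dragPx: Float, ordering: Ordering): List<String> {
    val count = (dragPx / 80f).toInt().coerceIn(0, devices.size)  // one icon per 80 px, assumed
    val sorted = when (ordering) {
        Ordering.RECENT -> devices.sortedByDescending { it.lastUsedMs }
        Ordering.FREQUENT -> devices.sortedByDescending { it.useCount }
    }
    return sorted.take(count).map { it.id }
}

fun main() {
    val devices = listOf(
        DeviceUsage("tv-1", lastUsedMs = 3_000, useCount = 12),
        DeviceUsage("phone-1", lastUsedMs = 9_000, useCount = 4)
    )
    println(revealedIcons(devices, dragPx = 170f, Ordering.RECENT))   // [phone-1, tv-1]
    println(revealedIcons(devices, dragPx = 90f, Ordering.FREQUENT))  // [tv-1]
}
```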
On the other hand, according to the present disclosure, information about an event occurring on an external device may be displayed even in a state where a home screen is displayed on the touch screen. In addition, when a predetermined type of touch is applied to the event information, it may be possible to enter a control screen capable of controlling the external device, or to control the external device immediately. This will be described in more detail with reference to the accompanying drawings. Figs. 9A to 9E are conceptual views for explaining various embodiments of displaying information about an event occurring on an external device.

Referring to Fig. 9A, when an event occurs on at least one external device disposed at the predetermined location at which the body is located, a notification icon 210 signaling the occurrence of the event may be displayed on the status display bar. When a predetermined type of touch-and-drag is applied to the notification icon 210, information about the event occurring on the external device may be displayed. Alternatively, when an event occurs on an external device disposed at the predetermined location at which the body is located, in a state where a home screen 911 is displayed on the touch screen 151, information about the event may be displayed. For example, when an event indicating that a missed call has been received occurs on a phone, event information 912 indicating that the missed call has been received can be displayed so as to overlap the home screen. In addition, when a predetermined type of touch is applied to the event information, the detailed information of the event can be displayed. The detailed information may be information about the originator of the missed call, the time at which the missed-call event occurred, and the like.

On the other hand, referring to FIG. 9B, when an event associated with a missed call occurs on the mobile terminal 100, the controller 180 displays information 921 associated with the event and an icon 922 indicating that the control of external devices at the location where the mobile terminal is located is enabled. When a touch is applied to the icon 922, icons 923, 924 corresponding to each region of the location at which the mobile terminal is located can be displayed. In addition, when the icon corresponding to one region is selected, icons 924a, 924b corresponding to a plurality of phones installed in the region may be displayed. When any one of the plurality of icons 924a, 924b is selected, a call connection signal may be transmitted to the terminal of the originator of the missed call via the phone corresponding to the selected icon 924b. On the other hand, the event information 921 may be information about an event associated with a missed call on any phone at the predetermined location. In this case, when the icon 922 is selected, a call connection signal can be transmitted to the terminal of the originator of the missed call via the mobile terminal 100. Alternatively, when a predetermined type of touch-and-drag is applied to the icon 922, icons 923, 924 corresponding to a plurality of phones installed at the predetermined location may be displayed. When a touch is applied to any one of the icons, a call connection signal may be transmitted to the terminal of the originator of the missed call via the phone corresponding to the icon to which the touch is applied. Referring to Fig. 9C, notification information 932 associated with reserved TV programs may be displayed in a state where a home screen 931 is displayed on the touch screen 151. When a touch is applied to the notification information 932, a window 933 for selecting an application with which to display the reserved programs can be displayed.
When any one of the applications is selected through the window 933, the reserved programs 934 can be displayed as videos on the touch screen 151 via the selected application. Referring to Fig. 9D, when an event occurs on an external device in a state where a control screen 941 for controlling the external devices at the predetermined location at which the mobile terminal is located is displayed on the touch screen 151, the controller 180 controls the touch screen 151 to display notification information 942 indicating that the event has occurred. When a touch is applied to the notification information 942, information 943 about the external device on which the event has occurred can be displayed. When there is a plurality of external devices on which events have occurred, information regarding the plurality of external devices may be displayed in list form. When a touch is applied to the information 943 about an external device on which the event has occurred, the detailed information 944 of the event may be displayed. Referring to FIG. 9E, when an event occurs on an external device in a state where a control screen 951 for controlling the external devices at a predetermined location is displayed on the touch screen 151, the controller 180 controls the touch screen 151 to display notification information 952 indicating that the event has occurred. When a touch is applied to the notification information 952, icons 953, 954 corresponding to the external devices on which events have occurred can be displayed. When a touch is applied to any one of the icons 953, the detailed information 955 of the event on the external device corresponding to the icon may be displayed. Therefore, a user can immediately recognize an event occurring on an external device regardless of the type of screen currently displayed.

On the other hand, when the notification mode is set to a sensitive mode, the mobile terminal 100 associated with the present disclosure may display the notification information of an external device using a plurality of different notification panels. More specifically, the plurality of notification panels may include first and second notification panels. Notification information associated with applications installed in the mobile terminal 100 may be displayed on the first notification panel, and notification information of an external device associated with the present disclosure may be displayed on the second notification panel. Figs. 10A to 10D are conceptual views for explaining a method of displaying the notification information of an external device when the notification mode according to an embodiment of the present disclosure is set to a sensitive mode. According to the present disclosure, when the notification mode is set to a sensitive mode, the notification information may not be displayed on the touch screen 151 even when an event occurs on the external device. That is, when the notification mode is set to a sensitive mode, the mobile terminal 100 associated with the present disclosure may not display the notification information on the first notification panel even though the notification information of an external device has been received. Then, the controller 180 may display, on the touch screen 151, the second notification panel on which the notification information is displayed, based on a touch-and-drag started from a region of the first notification panel (e.g., a region adjacent to an edge of the first notification panel) and applied in one direction.
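The two-panel behaviour in the sensitive mode reduces to a routing rule: external-device notifications bypass the first panel and accumulate on the second until the user drags it into view. A minimal Kotlin sketch under that reading; the naming is illustrative.

```kotlin
// Illustrative two-panel routing for the sensitive mode.

data class Notification(val text: String, val fromExternalDevice: Boolean)

class NotificationPanels(private val sensitiveMode: Boolean) {
    val firstPanel = mutableListOf<Notification>()   // application notifications
    val secondPanel = mutableListOf<Notification>()  // external-device notifications

    fun post(notification: Notification) {
        if (notification.fromExternalDevice && sensitiveMode) {
            secondPanel += notification  // held back, shown only on request
        } else {
            firstPanel += notification
        }
    }
}

fun main() {
    val panels = NotificationPanels(sensitiveMode = true)
    panels.post(Notification("New message", fromExternalDevice = false))
    panels.post(Notification("Missed call on home phone", fromExternalDevice = true))
    println(panels.firstPanel.map { it.text })   // [New message]
    println(panels.secondPanel.map { it.text })  // [Missed call on home phone]
}
```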
On the other hand, the present disclosure may display the second notification panel, on which the notification information of an external device is displayed, on the touch screen 151 in various ways. For example, when a region 1011 of the first notification panel 1010 is touched in a state where the first notification panel 1010 is displayed, as shown in the first drawing of Fig. 10A, the controller 180 may display a notification icon 1010a associated with an external device in one region, as illustrated in the second drawing of Figure 10A. Here, the notification icon 1010a associated with an external device may be an icon of an application for accessing a control screen for controlling an external device. Then, the controller 180 may display, on the touch screen 151, the second notification panel 1020 including (displaying) the notification information 1021a, 1021b of external devices, based on a touch-and-drag started from the icon 1010a and applied in one direction (e.g., a downward direction), as shown in the third drawing of FIG. 10A. Here, the notification information 1021a, 1021b of the external devices may be notification information signaling events occurring on the external devices disposed at the location where the body is located. For example, it may be notification information of an event associated with a "missed call" occurring on a phone installed at the location where the body is located.

As another example, as shown in the first drawing of Figure 10B, the first notification panel 1010 may include a specific region 1030 on which notification information is displayed. When the size of the notification information (or the number of items of notification information) displayed in the specific region exceeds the size of the specific region 1030, the controller 180 can scroll the notification information according to a touch-and-drag applied to the specific region 1030, as illustrated in the first and second drawings of Figure 10B. When a touch-and-drag is applied to the specific region in a direction (the upward direction) opposite to the scrolling direction (the downward direction) in a state where the notification information corresponding to the last item is displayed in the specific region 1030, the controller 180 may display, on the touch screen 151, the second notification panel on which the notification information of an external device is displayed. Specifically, when a touch-and-drag is applied in the opposite direction (the upward direction) over a first distance (d1) in a state where the notification information 1031 corresponding to the last item is displayed in the specific region 1030, as illustrated in the second drawing of Figure 10B, the controller 180 may display, in the specific region 1030, a graphic image 1032 indicating that scrolling by the touch-and-drag applied in the opposite direction is disabled. Then, when the touch-and-drag (a touch-and-drag applied in the upward direction) is applied over a second distance (d2) greater than the first distance (d1), as illustrated in the third drawing of Figure 10B, the controller 180 may display an icon 1034a associated with external devices on the graphic image 1032. Then, when the touch-and-drag (a touch-and-drag applied in the upward direction) is applied over a third distance (d3) greater than the second distance (d2), as illustrated in the fourth drawing of Figure 10B, the controller 180 may display, on the touch screen 151, the second notification panel 1020 including the notification information 1021a, 1021b of the external devices. At this time, a portion of the second notification panel 1020 may be displayed on the touch screen 151. Then, when a touch-and-drag applied over a distance greater than the third distance is released, the controller 180 may display, on the touch screen 151, the second notification panel 1020 including (displaying) the notification information 1021a, 1021b of the external devices, as shown in the fifth drawing of Figure 10B.
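The staged reveal of Fig. 10B (a hint graphic at d1, a device icon at d2, part of the second panel at d3, the full panel on release past d3) maps naturally onto threshold checks over the overscroll distance. The pixel values below merely stand in for the unspecified distances d1 < d2 < d3.

```kotlin
// Illustrative state machine for the staged overscroll reveal of Fig. 10B.

enum class RevealStage { NONE, HINT_GRAPHIC, DEVICE_ICON, PARTIAL_PANEL }

const val D1_PX = 40f
const val D2_PX = 90f
const val D3_PX = 150f

fun stageFor(overscrollPx: Float): RevealStage = when {
    overscrollPx >= D3_PX -> RevealStage.PARTIAL_PANEL
    overscrollPx >= D2_PX -> RevealStage.DEVICE_ICON
    overscrollPx >= D1_PX -> RevealStage.HINT_GRAPHIC
    else -> RevealStage.NONE
}

fun shouldOpenSecondPanelOnRelease(overscrollPx: Float): Boolean = overscrollPx >= D3_PX

fun main() {
    println(stageFor(60f))                         // HINT_GRAPHIC
    println(stageFor(120f))                        // DEVICE_ICON
    println(shouldOpenSecondPanelOnRelease(160f))  // true
}
```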
At the third distance (d3), only a portion of the second notification panel 1020 may be displayed on the touch screen 151. Then, when a touch-and-drag applied over a distance greater than the third distance (d3) is released, the controller 180 may display, on the touch screen 151, the second notification panel 1020 on which the notification information 1021a, 1021b of external devices is displayed, as shown in the fifth drawing of Figure 10B. On the other hand, the present disclosure may display a second notification panel indicating the notification information of external devices on the touch screen even when the first notification panel is not entirely displayed. For example, when a touch is applied to a region 220 (e.g., status display bar) of the touch screen 151, as shown in the first drawing of Fig. 10C, a portion 1010a of the first notification panel can be displayed in the region 220 of the touch screen 151. The portion 1010a of the first notification panel may be displayed while the touch is maintained on the touch screen 151. When a touch having a predetermined profile is applied consecutively to the held touch in a state where the portion 1010a of the first notification panel is displayed (i.e., while the touch is maintained), as shown in the second drawing of Figure 10C, the portion 1010a of the first notification panel may be changed to a portion 1020a of the second notification panel, as shown in the third drawing of Figure 10C. The predetermined profile may be a profile that advances in one direction and then returns in the opposite direction, as illustrated in the second drawing of Figure 10C. When a touch having the predetermined profile is applied consecutively to the held touch, the controller 180 can change the portion 1010a of the first notification panel displayed in the region 220 of the touch screen 151 into the portion 1020a of the second notification panel capable of displaying the notification information of external devices. Then, the controller 180 may display the second notification panel 1020 on the touch screen 151, as illustrated in the fourth drawing of Fig. 10C, based on a touch-and-drag applied in one direction consecutively to the touch having the predetermined profile, in a state where the portion 1020a of the second notification panel is displayed (i.e., while the touch having the predetermined profile is maintained), as shown in the third drawing of Figure 10C. The notification information 1021a, 1021b of external devices on which events have occurred can be displayed on the second notification panel 1020, as shown in the fourth drawing of Fig. 10C. For another example, when a touch-and-drag starting from at least two points is applied to a region (e.g., status display bar) of the touch screen 151, the controller 180 may display the second notification panel instead of the first notification panel on the touch screen 151. Referring to Fig. 10D, when a touch is applied to at least two points in the region 220 of the touch screen 151, as shown in the first drawing of Fig. 10D, the controller 180 may display the portion 1020a of the second notification panel instead of the first notification panel (or a portion of the first notification panel) in a region 330, as shown in the second drawing of Figure 10D. 
Then, when a touch-and-drag (i.e., a touch-and-drag starting from at least two points) applied in one direction is applied consecutively to the touches applied to the at least two points, the controller 180 may display, on the touch screen 151, the second notification panel 1020 on which the notification information 1021a, 1021b of external devices is displayed, as shown in the second drawing of Figure 10D. With the above configuration, it may be possible to provide various user interfaces capable of displaying, on the touch screen, a second notification panel on which the notification information of external devices on which events have occurred is displayed. In a mobile terminal according to the present disclosure, when a predetermined type of touch is applied to the touch screen based on the terminal body communicating with an external communication device installed at a predetermined location, a control screen associated with the predetermined location can be displayed. As a result, the user can more conveniently access the control screen associated with the predetermined location. The control screen may include map images associated with the predetermined location and image objects corresponding to external devices arranged at the predetermined location. The map image and the image objects can provide a UI/UX allowing the user to use the control screen more intuitively. In addition, the map images associated with the predetermined location may be map images for various regions of the predetermined location. Furthermore, the map images for the various regions can be converted into one another. As a result, the user can use the control screen more efficiently through the various map images. In addition, when a predetermined type of touch is applied to an image object, a control command for the image object can be generated. As a result, the user can more conveniently control external devices arranged at the predetermined location via the control screen. The present invention may be implemented in the form of computer-readable codes on a medium on which a program is recorded. Computer-readable media include all types of recording devices in which data readable by a computer system are stored. Examples of the computer-readable media may include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and also include a device implemented in the form of a carrier wave (for example, transmission via the Internet). In addition, the computer may include the controller 180 of the mobile terminal. Therefore, the detailed description of the invention should not be construed as restrictive in all respects but considered illustrative. The scope of the invention is to be determined by reasonable interpretation of the appended claims, and any changes that come within the equivalent scope of the invention are included within the scope of the invention. One aspect of the present disclosure is to provide a mobile terminal and an associated control method capable of performing control associated with a predetermined location. Another aspect of the present disclosure is to provide a mobile terminal and an associated control method capable of performing control associated with a plurality of predetermined locations. 
Yet another aspect of the present disclosure is to provide a mobile terminal and an associated control method capable of performing control associated with a predetermined location at which a terminal body is currently located among a plurality of predetermined locations. In order to accomplish the foregoing objectives, a mobile terminal according to one embodiment may include a body; a wireless communication unit configured to communicate with an external communication device; a touch screen formed to display a home screen and detect a touch; and a controller configured to control the touch screen to display a control screen for controlling at least a portion of external devices at a location where the body is located, in accordance with the external communication device connected through the wireless communication unit, when a predetermined type of touch is applied to the touch screen while the home screen is displayed. According to one embodiment, the predetermined type of touch may be a multi-touch applied to at least two or more touch points on the touch screen. According to one embodiment, the multi-touch may be a pinch-out touch that increases a relative distance between two touch points on the touch screen (see the sketch following this passage). According to one embodiment, a first control screen for controlling at least a portion of external devices arranged at a first location may be displayed on the touch screen when the body is located at the first location, and a second control screen for controlling at least a portion of external devices disposed at a second location may be displayed on the touch screen when the body is located at the second location different from the first location. According to one embodiment, when the wireless communication unit is wirelessly connected to the external communication device, the controller can control the touch screen to display notification information indicating the wireless connection. According to one embodiment, the control screen may include a map image for at least a partial region of the predetermined location. Here, the map image may include a first map image for a first region of the predetermined location and a second map image for a second region of the predetermined location, and the first region and the second region may be different regions, or regions in which one region contains the other. In addition, when a predetermined type of touch is applied to the touch screen in a state where either one of the first and second map images is displayed thereon, the controller may control the touch screen to display the other map image. In addition, the map image may be a three-dimensional map image for a partial region of the predetermined location, and the three-dimensional map image may be a three-dimensional map image pre-stored in the body or an image received in real time from a view-capturing apparatus (camera) provided in the body. Here, the three-dimensional map image can be displayed on the touch screen so as to correspond to at least one of a current location of the body detected within the predetermined location and a direction in which the body is oriented. Here, when a predetermined type of touch is applied to the three-dimensional map image, the controller can control the touch screen to display a plan-view image for a partial region containing the current location of the body within the predetermined location. 
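The pinch-out touch mentioned above can be recognized by tracking the distance between two contact points. The following Kotlin fragment is a minimal sketch under assumptions: the type names (PinchOutDetector, TouchPoint) and the growth threshold are hypothetical illustrations, not part of the disclosure.

```kotlin
// Sketch of pinch-out detection: a multi-touch in which the relative
// distance between two contact points grows past a threshold.
import kotlin.math.hypot

data class TouchPoint(val x: Float, val y: Float)

class PinchOutDetector(private val minGrowthPx: Float = 80f) {
    private var startDistance = -1f   // -1 means no two-point gesture in progress

    private fun distance(a: TouchPoint, b: TouchPoint): Float =
        hypot(a.x - b.x, a.y - b.y)

    // Call when two contact points first land on the touch screen.
    fun onTwoPointsDown(a: TouchPoint, b: TouchPoint) { startDistance = distance(a, b) }

    // Call while both points move; returns true once the relative distance
    // between the two touch points has grown enough to count as a pinch-out.
    fun onTwoPointsMove(a: TouchPoint, b: TouchPoint): Boolean =
        startDistance >= 0f && distance(a, b) - startDistance >= minGrowthPx

    // Call when either point is lifted.
    fun onPointsUp() { startDistance = -1f }
}
```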
In addition, the control screen may include at least one of an image object, corresponding to an external device disposed at the predetermined location and overlapping the map image, and status information of the external device. Here, the status information of the external device may be displayed at a position overlapping, or adjacent to, the image object corresponding to the external device. Here, the controller may generate a control command for controlling the external device according to a predetermined type of touch applied to the image object. According to one embodiment, when the image object is an image object corresponding to a TV set, the controller can receive image information displayed on the TV set via the wireless communication unit, and display the image information via the touch screen according to a predetermined type of touch applied to the image object. According to one embodiment, when a predetermined type of touch is applied to the control screen in a state where the control screen is displayed on the touch screen, the controller can control the touch screen to display the home screen, and the home screen includes an icon linked to the control screen. According to one embodiment, the icon linked to the control screen may include a plurality of icons corresponding to a plurality of regions of the predetermined location. According to one embodiment, when an event occurs on an external device in a state where the home screen is displayed on the touch screen, the controller may display a notification icon for reporting the event at a position overlapping, or adjacent to, an icon corresponding to the region in which the external device is disposed among the plurality of icons. A method of controlling a mobile terminal according to an embodiment of the present disclosure may include performing communication with an external communication device installed at a predetermined location via a wireless communication unit; and controlling a touch screen to display a control screen for performing control associated with the predetermined location when a predetermined type of touch is applied to the touch screen after communicating with the external communication device, based on the body being located at the predetermined location. Any reference in this specification to "one embodiment", "an illustrative embodiment", and the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Appearances of such expressions in various places in the specification are not necessarily all referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that one of ordinary skill in the art would be able to implement such a feature, structure, or characteristic in combination with other embodiments. Although embodiments have been described with reference to a number of illustrative embodiments of the invention, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, various variations and modifications are possible in the component parts and/or component arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. 
In addition to variations and modifications in the component parts and/or component arrangements, alternative uses will also be apparent to those skilled in the art.
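To close the description, the overall flow summarized above and claimed below — the terminal is connected to the communication device of the place where it is located, and a predetermined touch on the home screen then opens the control screen for that place — might be sketched as follows. All names (ControlScreenGate, onHomeScreenTouch, and so on) are hypothetical; this is an illustrative sketch, not the claimed implementation.

```kotlin
// Sketch of the claimed control flow: the control screen appears only when
// the body is determined to be at the predetermined location (via the
// connected external communication device) AND the predetermined touch
// arrives while the home screen is displayed.
class ControlScreenGate(
    private val isConnectedToPlaceHub: () -> Boolean,         // wireless unit connected to the local communication device
    private val showControlScreen: (placeId: String) -> Unit  // render the control screen of that place
) {
    // Called for touches arriving while the home screen is displayed.
    fun onHomeScreenTouch(isPredeterminedTouch: Boolean, currentPlaceId: String?) {
        if (!isPredeterminedTouch) return                      // e.g., not a pinch-out
        if (currentPlaceId == null || !isConnectedToPlaceHub()) return
        showControlScreen(currentPlaceId)                      // first or second location yields its own screen
    }
}
```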
Claims (15)
1. A mobile terminal (100), comprising: a body; a wireless communication unit (110) configured to communicate with an external communication device that is coupled to a plurality of external devices located at a predetermined location; a touch screen (151) configured to display a home screen and to detect a touch; and a controller (180) configured to control the touch screen (151) to display a control screen for controlling one or more of the external devices at the predetermined location, wherein the control screen is displayed when a predetermined type of touch is applied to the touch screen (151) while the home screen is displayed, and the body of the mobile terminal (100) is determined to be at the predetermined location based on the external communication device connected via the wireless communication unit (110).
2. The mobile terminal (100) according to claim 1, wherein the predetermined type of touch is a multi-touch applied to at least two contact points on the touch screen (151).
3. The mobile terminal (100) according to claim 2, wherein the multi-touch is a pinch-out touch that increases a relative distance between two touch points on the touch screen (151).
4. The mobile terminal (100) according to claim 2, wherein the multi-touch maintains a predetermined number of two or more touch points on the touch screen (151) for more than a predetermined period.
5. The mobile terminal (100) according to claim 1, wherein a first control screen for controlling at least one external device disposed at a first location is displayed on the touch screen (151) when the body is located at the first location, and a second control screen for controlling at least one external device disposed at a second location is displayed on the touch screen (151) when the body is located at the second location different from the first location.
6. The mobile terminal (100) according to claim 1, wherein, when the wireless communication unit (110) is wirelessly connected to the external communication device, the controller (180) controls the touch screen (151) to display notification information indicating the wireless connection.
7. The mobile terminal (100) according to claim 1, wherein the control screen comprises a map image for at least a partial region of the predetermined location.
8. The mobile terminal (100) according to claim 7, wherein the map image comprises a first map image for a first region of the predetermined location and a second map image for a second region of the predetermined location, and the first region and the second region are different regions, or regions in which one region includes the other.
9. The mobile terminal (100) according to claim 8, wherein, when a predetermined type of touch is applied to the touch screen (151) in a state in which the first map image or the second map image is displayed, the controller (180) controls the touch screen (151) to display the other map image.
10. The mobile terminal (100) according to claim 7, wherein a detection unit which detects an angle of the body with respect to the ground is provided in the body, and the controller (180) controls the touch screen (151) to display a first type of map image when the angle detected by the detection unit is greater than a predetermined angle, and controls the touch screen (151) to display a second type of map image when the angle detected by the detection unit is less than the predetermined angle.
11. The mobile terminal (100) according to claim 7, wherein the map image is a three-dimensional map image for a partial region of the predetermined location, and the three-dimensional map image is a three-dimensional map image pre-stored in the body or an image received in real time from a view-capturing apparatus provided in the body.
12. The mobile terminal (100) according to claim 11, wherein the three-dimensional map image is displayed on the touch screen (151) so as to correspond to at least one of a current location of the body detected within the predetermined location and a direction in which the body is oriented.
13. The mobile terminal (100) according to claim 7, wherein the control screen comprises at least one of an image object, corresponding to an external device disposed at the predetermined location and overlapping the map image, and status information of the corresponding external device.
14. The mobile terminal (100) according to claim 13, wherein the status information is displayed at a position overlapping, or adjacent to, the image object of the corresponding external device.
15. The mobile terminal (100) according to claim 1, wherein, when a predetermined type of touch is applied to the control screen in a state in which the control screen is displayed on the touch screen (151), the controller (180) controls the touch screen (151) to display the home screen, and the home screen includes an icon linked to the control screen.